Nov 22 09:09:04 localhost kernel: Linux version 5.14.0-639.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025
Nov 22 09:09:04 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 22 09:09:04 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 09:09:04 localhost kernel: BIOS-provided physical RAM map:
Nov 22 09:09:04 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 22 09:09:04 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 22 09:09:04 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 22 09:09:04 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 22 09:09:04 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 22 09:09:04 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 22 09:09:04 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 22 09:09:04 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 22 09:09:04 localhost kernel: NX (Execute Disable) protection: active
Nov 22 09:09:04 localhost kernel: APIC: Static calls initialized
Nov 22 09:09:04 localhost kernel: SMBIOS 2.8 present.
Nov 22 09:09:04 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 22 09:09:04 localhost kernel: Hypervisor detected: KVM
Nov 22 09:09:04 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 22 09:09:04 localhost kernel: kvm-clock: using sched offset of 9942425899 cycles
Nov 22 09:09:04 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 22 09:09:04 localhost kernel: tsc: Detected 2799.998 MHz processor
Nov 22 09:09:04 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 22 09:09:04 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 22 09:09:04 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 22 09:09:04 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 22 09:09:04 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 22 09:09:04 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 22 09:09:04 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 22 09:09:04 localhost kernel: Using GB pages for direct mapping
Nov 22 09:09:04 localhost kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 22 09:09:04 localhost kernel: ACPI: Early table checksum verification disabled
Nov 22 09:09:04 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 22 09:09:04 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 09:09:04 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 09:09:04 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 09:09:04 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 22 09:09:04 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 09:09:04 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 09:09:04 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 22 09:09:04 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 22 09:09:04 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 22 09:09:04 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 22 09:09:04 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 22 09:09:04 localhost kernel: No NUMA configuration found
Nov 22 09:09:04 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 22 09:09:04 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 22 09:09:04 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 22 09:09:04 localhost kernel: Zone ranges:
Nov 22 09:09:04 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 22 09:09:04 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 22 09:09:04 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 22 09:09:04 localhost kernel:   Device   empty
Nov 22 09:09:04 localhost kernel: Movable zone start for each node
Nov 22 09:09:04 localhost kernel: Early memory node ranges
Nov 22 09:09:04 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 22 09:09:04 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 22 09:09:04 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 22 09:09:04 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 22 09:09:04 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 22 09:09:04 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 22 09:09:04 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 22 09:09:04 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 22 09:09:04 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 22 09:09:04 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 22 09:09:04 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 22 09:09:04 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 22 09:09:04 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 22 09:09:04 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 22 09:09:04 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 22 09:09:04 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 22 09:09:04 localhost kernel: TSC deadline timer available
Nov 22 09:09:04 localhost kernel: CPU topo: Max. logical packages:   8
Nov 22 09:09:04 localhost kernel: CPU topo: Max. logical dies:       8
Nov 22 09:09:04 localhost kernel: CPU topo: Max. dies per package:   1
Nov 22 09:09:04 localhost kernel: CPU topo: Max. threads per core:   1
Nov 22 09:09:04 localhost kernel: CPU topo: Num. cores per package:     1
Nov 22 09:09:04 localhost kernel: CPU topo: Num. threads per package:   1
Nov 22 09:09:04 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 22 09:09:04 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 22 09:09:04 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 22 09:09:04 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 22 09:09:04 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 22 09:09:04 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 22 09:09:04 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 22 09:09:04 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 22 09:09:04 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 22 09:09:04 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 22 09:09:04 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 22 09:09:04 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 22 09:09:04 localhost kernel: Booting paravirtualized kernel on KVM
Nov 22 09:09:04 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 22 09:09:04 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 22 09:09:04 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 22 09:09:04 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Nov 22 09:09:04 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 22 09:09:04 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 22 09:09:04 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 09:09:04 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64", will be passed to user space.
Nov 22 09:09:04 localhost kernel: random: crng init done
Nov 22 09:09:04 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 22 09:09:04 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 22 09:09:04 localhost kernel: Fallback order for Node 0: 0 
Nov 22 09:09:04 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 22 09:09:04 localhost kernel: Policy zone: Normal
Nov 22 09:09:04 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 22 09:09:04 localhost kernel: software IO TLB: area num 8.
Nov 22 09:09:04 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 22 09:09:04 localhost kernel: ftrace: allocating 49298 entries in 193 pages
Nov 22 09:09:04 localhost kernel: ftrace: allocated 193 pages with 3 groups
Nov 22 09:09:04 localhost kernel: Dynamic Preempt: voluntary
Nov 22 09:09:04 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 22 09:09:04 localhost kernel: rcu:         RCU event tracing is enabled.
Nov 22 09:09:04 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 22 09:09:04 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 22 09:09:04 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 22 09:09:04 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 22 09:09:04 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 22 09:09:04 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 22 09:09:04 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 09:09:04 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 09:09:04 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 09:09:04 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 22 09:09:04 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 22 09:09:04 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 22 09:09:04 localhost kernel: Console: colour VGA+ 80x25
Nov 22 09:09:04 localhost kernel: printk: console [ttyS0] enabled
Nov 22 09:09:04 localhost kernel: ACPI: Core revision 20230331
Nov 22 09:09:04 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 22 09:09:04 localhost kernel: x2apic enabled
Nov 22 09:09:04 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Nov 22 09:09:04 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 22 09:09:04 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 22 09:09:04 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 22 09:09:04 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 22 09:09:04 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 22 09:09:04 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 22 09:09:04 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 22 09:09:04 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 22 09:09:04 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 22 09:09:04 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 22 09:09:04 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 22 09:09:04 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 22 09:09:04 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 22 09:09:04 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 22 09:09:04 localhost kernel: x86/bugs: return thunk changed
Nov 22 09:09:04 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 22 09:09:04 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 22 09:09:04 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 22 09:09:04 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 22 09:09:04 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 22 09:09:04 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 22 09:09:04 localhost kernel: Freeing SMP alternatives memory: 40K
Nov 22 09:09:04 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 22 09:09:04 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 22 09:09:04 localhost kernel: landlock: Up and running.
Nov 22 09:09:04 localhost kernel: Yama: becoming mindful.
Nov 22 09:09:04 localhost kernel: SELinux:  Initializing.
Nov 22 09:09:04 localhost kernel: LSM support for eBPF active
Nov 22 09:09:04 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 22 09:09:04 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 22 09:09:04 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 22 09:09:04 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 22 09:09:04 localhost kernel: ... version:                0
Nov 22 09:09:04 localhost kernel: ... bit width:              48
Nov 22 09:09:04 localhost kernel: ... generic registers:      6
Nov 22 09:09:04 localhost kernel: ... value mask:             0000ffffffffffff
Nov 22 09:09:04 localhost kernel: ... max period:             00007fffffffffff
Nov 22 09:09:04 localhost kernel: ... fixed-purpose events:   0
Nov 22 09:09:04 localhost kernel: ... event mask:             000000000000003f
Nov 22 09:09:04 localhost kernel: signal: max sigframe size: 1776
Nov 22 09:09:04 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 22 09:09:04 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 22 09:09:04 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 22 09:09:04 localhost kernel: smpboot: x86: Booting SMP configuration:
Nov 22 09:09:04 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 22 09:09:04 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 22 09:09:04 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 22 09:09:04 localhost kernel: node 0 deferred pages initialised in 8ms
Nov 22 09:09:04 localhost kernel: Memory: 7765864K/8388068K available (16384K kernel code, 5786K rwdata, 13900K rodata, 4188K init, 7176K bss, 616268K reserved, 0K cma-reserved)
Nov 22 09:09:04 localhost kernel: devtmpfs: initialized
Nov 22 09:09:04 localhost kernel: x86/mm: Memory block size: 128MB
Nov 22 09:09:04 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 22 09:09:04 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 22 09:09:04 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 22 09:09:04 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 22 09:09:04 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 22 09:09:04 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 22 09:09:04 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 22 09:09:04 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 22 09:09:04 localhost kernel: audit: type=2000 audit(1763802542.559:1): state=initialized audit_enabled=0 res=1
Nov 22 09:09:04 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 22 09:09:04 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 22 09:09:04 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 22 09:09:04 localhost kernel: cpuidle: using governor menu
Nov 22 09:09:04 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 22 09:09:04 localhost kernel: PCI: Using configuration type 1 for base access
Nov 22 09:09:04 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 22 09:09:04 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 22 09:09:04 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 22 09:09:04 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 22 09:09:04 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 22 09:09:04 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 22 09:09:04 localhost kernel: Demotion targets for Node 0: null
Nov 22 09:09:04 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 22 09:09:04 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 22 09:09:04 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 22 09:09:04 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 22 09:09:04 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 22 09:09:04 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 22 09:09:04 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 22 09:09:04 localhost kernel: ACPI: Interpreter enabled
Nov 22 09:09:04 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 22 09:09:04 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 22 09:09:04 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 22 09:09:04 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 22 09:09:04 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 22 09:09:04 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 22 09:09:04 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [3] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [4] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [5] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [6] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [7] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [8] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [9] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [10] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [11] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [12] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [13] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [14] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [15] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [16] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [17] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [18] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [19] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [20] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [21] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [22] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [23] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [24] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [25] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [26] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [27] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [28] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [29] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [30] registered
Nov 22 09:09:04 localhost kernel: acpiphp: Slot [31] registered
Nov 22 09:09:04 localhost kernel: PCI host bridge to bus 0000:00
Nov 22 09:09:04 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 22 09:09:04 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 22 09:09:04 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 22 09:09:04 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 22 09:09:04 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 22 09:09:04 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 22 09:09:04 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 22 09:09:04 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 22 09:09:04 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 22 09:09:04 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 22 09:09:04 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 22 09:09:04 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 22 09:09:04 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 22 09:09:04 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 22 09:09:04 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 22 09:09:04 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 22 09:09:04 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 22 09:09:04 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 22 09:09:04 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 22 09:09:04 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 22 09:09:04 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 22 09:09:04 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 22 09:09:04 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 22 09:09:04 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 22 09:09:04 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 22 09:09:04 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 22 09:09:04 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 22 09:09:04 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 22 09:09:04 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 22 09:09:04 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 22 09:09:04 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 22 09:09:04 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 22 09:09:04 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 22 09:09:04 localhost kernel: iommu: Default domain type: Translated
Nov 22 09:09:04 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 22 09:09:04 localhost kernel: SCSI subsystem initialized
Nov 22 09:09:04 localhost kernel: ACPI: bus type USB registered
Nov 22 09:09:04 localhost kernel: usbcore: registered new interface driver usbfs
Nov 22 09:09:04 localhost kernel: usbcore: registered new interface driver hub
Nov 22 09:09:04 localhost kernel: usbcore: registered new device driver usb
Nov 22 09:09:04 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 22 09:09:04 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 22 09:09:04 localhost kernel: PTP clock support registered
Nov 22 09:09:04 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 22 09:09:04 localhost kernel: NetLabel: Initializing
Nov 22 09:09:04 localhost kernel: NetLabel:  domain hash size = 128
Nov 22 09:09:04 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 22 09:09:04 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 22 09:09:04 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 22 09:09:04 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 22 09:09:04 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 22 09:09:04 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 22 09:09:04 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 22 09:09:04 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 22 09:09:04 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 22 09:09:04 localhost kernel: vgaarb: loaded
Nov 22 09:09:04 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 22 09:09:04 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 22 09:09:04 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 22 09:09:04 localhost kernel: pnp: PnP ACPI init
Nov 22 09:09:04 localhost kernel: pnp 00:03: [dma 2]
Nov 22 09:09:04 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 22 09:09:04 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 22 09:09:04 localhost kernel: NET: Registered PF_INET protocol family
Nov 22 09:09:04 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 22 09:09:04 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 22 09:09:04 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 22 09:09:04 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 22 09:09:04 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 22 09:09:04 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 22 09:09:04 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 22 09:09:04 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 22 09:09:04 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 22 09:09:04 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 22 09:09:04 localhost kernel: NET: Registered PF_XDP protocol family
Nov 22 09:09:04 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 22 09:09:04 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 22 09:09:04 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 22 09:09:04 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 22 09:09:04 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 22 09:09:04 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 22 09:09:04 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 22 09:09:04 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 84478 usecs
Nov 22 09:09:04 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 22 09:09:04 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 22 09:09:04 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 22 09:09:04 localhost kernel: ACPI: bus type thunderbolt registered
Nov 22 09:09:04 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 22 09:09:04 localhost kernel: Initialise system trusted keyrings
Nov 22 09:09:04 localhost kernel: Key type blacklist registered
Nov 22 09:09:04 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 22 09:09:04 localhost kernel: zbud: loaded
Nov 22 09:09:04 localhost kernel: integrity: Platform Keyring initialized
Nov 22 09:09:04 localhost kernel: integrity: Machine keyring initialized
Nov 22 09:09:04 localhost kernel: Freeing initrd memory: 85868K
Nov 22 09:09:04 localhost kernel: NET: Registered PF_ALG protocol family
Nov 22 09:09:04 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 22 09:09:04 localhost kernel: Key type asymmetric registered
Nov 22 09:09:04 localhost kernel: Asymmetric key parser 'x509' registered
Nov 22 09:09:04 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 22 09:09:04 localhost kernel: io scheduler mq-deadline registered
Nov 22 09:09:04 localhost kernel: io scheduler kyber registered
Nov 22 09:09:04 localhost kernel: io scheduler bfq registered
Nov 22 09:09:04 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 22 09:09:04 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 22 09:09:04 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 22 09:09:04 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 22 09:09:04 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 22 09:09:04 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 22 09:09:04 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 22 09:09:04 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 22 09:09:04 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 22 09:09:04 localhost kernel: Non-volatile memory driver v1.3
Nov 22 09:09:04 localhost kernel: rdac: device handler registered
Nov 22 09:09:04 localhost kernel: hp_sw: device handler registered
Nov 22 09:09:04 localhost kernel: emc: device handler registered
Nov 22 09:09:04 localhost kernel: alua: device handler registered
Nov 22 09:09:04 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 22 09:09:04 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 22 09:09:04 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 22 09:09:04 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 22 09:09:04 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 22 09:09:04 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 22 09:09:04 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 22 09:09:04 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-639.el9.x86_64 uhci_hcd
Nov 22 09:09:04 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 22 09:09:04 localhost kernel: hub 1-0:1.0: USB hub found
Nov 22 09:09:04 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 22 09:09:04 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 22 09:09:04 localhost kernel: usbserial: USB Serial support registered for generic
Nov 22 09:09:04 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 22 09:09:04 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 22 09:09:04 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 22 09:09:04 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 22 09:09:04 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 22 09:09:04 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 22 09:09:04 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 22 09:09:04 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-22T09:09:03 UTC (1763802543)
Nov 22 09:09:04 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 22 09:09:04 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 22 09:09:04 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 22 09:09:04 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 22 09:09:04 localhost kernel: usbcore: registered new interface driver usbhid
Nov 22 09:09:04 localhost kernel: usbhid: USB HID core driver
Nov 22 09:09:04 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 22 09:09:04 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 22 09:09:04 localhost kernel: Initializing XFRM netlink socket
Nov 22 09:09:04 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 22 09:09:04 localhost kernel: Segment Routing with IPv6
Nov 22 09:09:04 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 22 09:09:04 localhost kernel: mpls_gso: MPLS GSO support
Nov 22 09:09:04 localhost kernel: IPI shorthand broadcast: enabled
Nov 22 09:09:04 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 22 09:09:04 localhost kernel: AES CTR mode by8 optimization enabled
Nov 22 09:09:04 localhost kernel: sched_clock: Marking stable (1306004504, 152604825)->(1538028689, -79419360)
Nov 22 09:09:04 localhost kernel: registered taskstats version 1
Nov 22 09:09:04 localhost kernel: Loading compiled-in X.509 certificates
Nov 22 09:09:04 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 22 09:09:04 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 22 09:09:04 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 22 09:09:04 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 22 09:09:04 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 22 09:09:04 localhost kernel: Demotion targets for Node 0: null
Nov 22 09:09:04 localhost kernel: page_owner is disabled
Nov 22 09:09:04 localhost kernel: Key type .fscrypt registered
Nov 22 09:09:04 localhost kernel: Key type fscrypt-provisioning registered
Nov 22 09:09:04 localhost kernel: Key type big_key registered
Nov 22 09:09:04 localhost kernel: Key type encrypted registered
Nov 22 09:09:04 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 22 09:09:04 localhost kernel: Loading compiled-in module X.509 certificates
Nov 22 09:09:04 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 22 09:09:04 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 22 09:09:04 localhost kernel: ima: No architecture policies found
Nov 22 09:09:04 localhost kernel: evm: Initialising EVM extended attributes:
Nov 22 09:09:04 localhost kernel: evm: security.selinux
Nov 22 09:09:04 localhost kernel: evm: security.SMACK64 (disabled)
Nov 22 09:09:04 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 22 09:09:04 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 22 09:09:04 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 22 09:09:04 localhost kernel: evm: security.apparmor (disabled)
Nov 22 09:09:04 localhost kernel: evm: security.ima
Nov 22 09:09:04 localhost kernel: evm: security.capability
Nov 22 09:09:04 localhost kernel: evm: HMAC attrs: 0x1
Nov 22 09:09:04 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 22 09:09:04 localhost kernel: Running certificate verification RSA selftest
Nov 22 09:09:04 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 22 09:09:04 localhost kernel: Running certificate verification ECDSA selftest
Nov 22 09:09:04 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 22 09:09:04 localhost kernel: clk: Disabling unused clocks
Nov 22 09:09:04 localhost kernel: Freeing unused decrypted memory: 2028K
Nov 22 09:09:04 localhost kernel: Freeing unused kernel image (initmem) memory: 4188K
Nov 22 09:09:04 localhost kernel: Write protecting the kernel read-only data: 30720k
Nov 22 09:09:04 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 22 09:09:04 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 22 09:09:04 localhost kernel: Run /init as init process
Nov 22 09:09:04 localhost kernel:   with arguments:
Nov 22 09:09:04 localhost kernel:     /init
Nov 22 09:09:04 localhost kernel:   with environment:
Nov 22 09:09:04 localhost kernel:     HOME=/
Nov 22 09:09:04 localhost kernel:     TERM=linux
Nov 22 09:09:04 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64
Nov 22 09:09:04 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 22 09:09:04 localhost systemd[1]: Detected virtualization kvm.
Nov 22 09:09:04 localhost systemd[1]: Detected architecture x86-64.
Nov 22 09:09:04 localhost systemd[1]: Running in initrd.
Nov 22 09:09:04 localhost systemd[1]: No hostname configured, using default hostname.
Nov 22 09:09:04 localhost systemd[1]: Hostname set to <localhost>.
Nov 22 09:09:04 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 22 09:09:04 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 22 09:09:04 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 22 09:09:04 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 22 09:09:04 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 22 09:09:04 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 22 09:09:04 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 22 09:09:04 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 22 09:09:04 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 22 09:09:04 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 22 09:09:04 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 22 09:09:04 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 22 09:09:04 localhost systemd[1]: Reached target Local File Systems.
Nov 22 09:09:04 localhost systemd[1]: Reached target Path Units.
Nov 22 09:09:04 localhost systemd[1]: Reached target Slice Units.
Nov 22 09:09:04 localhost systemd[1]: Reached target Swaps.
Nov 22 09:09:04 localhost systemd[1]: Reached target Timer Units.
Nov 22 09:09:04 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 22 09:09:04 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 22 09:09:04 localhost systemd[1]: Listening on Journal Socket.
Nov 22 09:09:04 localhost systemd[1]: Listening on udev Control Socket.
Nov 22 09:09:04 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 22 09:09:04 localhost systemd[1]: Reached target Socket Units.
Nov 22 09:09:04 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 22 09:09:04 localhost systemd[1]: Starting Journal Service...
Nov 22 09:09:04 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 22 09:09:04 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 22 09:09:04 localhost systemd[1]: Starting Create System Users...
Nov 22 09:09:04 localhost systemd[1]: Starting Setup Virtual Console...
Nov 22 09:09:04 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 22 09:09:04 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 22 09:09:04 localhost systemd[1]: Finished Create System Users.
Nov 22 09:09:04 localhost systemd-journald[306]: Journal started
Nov 22 09:09:04 localhost systemd-journald[306]: Runtime Journal (/run/log/journal/451f2cde49d145fabcb27147a4a4b091) is 8.0M, max 153.6M, 145.6M free.
Nov 22 09:09:04 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Nov 22 09:09:04 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Nov 22 09:09:04 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 22 09:09:04 localhost systemd[1]: Started Journal Service.
Nov 22 09:09:04 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 22 09:09:04 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 22 09:09:04 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 22 09:09:04 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 22 09:09:04 localhost systemd[1]: Finished Setup Virtual Console.
Nov 22 09:09:04 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 22 09:09:04 localhost systemd[1]: Starting dracut cmdline hook...
Nov 22 09:09:04 localhost dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Nov 22 09:09:04 localhost dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 09:09:04 localhost systemd[1]: Finished dracut cmdline hook.
Nov 22 09:09:04 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 22 09:09:04 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 22 09:09:04 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 22 09:09:04 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 22 09:09:04 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 22 09:09:04 localhost kernel: RPC: Registered udp transport module.
Nov 22 09:09:04 localhost kernel: RPC: Registered tcp transport module.
Nov 22 09:09:04 localhost kernel: RPC: Registered tcp-with-tls transport module.
Nov 22 09:09:04 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 22 09:09:04 localhost rpc.statd[442]: Version 2.5.4 starting
Nov 22 09:09:04 localhost rpc.statd[442]: Initializing NSM state
Nov 22 09:09:04 localhost rpc.idmapd[447]: Setting log level to 0
Nov 22 09:09:04 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 22 09:09:04 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 22 09:09:04 localhost systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Nov 22 09:09:04 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 22 09:09:04 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 22 09:09:04 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 22 09:09:04 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 22 09:09:04 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 22 09:09:04 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 22 09:09:04 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 22 09:09:04 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 22 09:09:04 localhost systemd[1]: Reached target Network.
Nov 22 09:09:04 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 22 09:09:04 localhost systemd[1]: Starting dracut initqueue hook...
Nov 22 09:09:04 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 09:09:04 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 22 09:09:04 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 22 09:09:04 localhost kernel: libata version 3.00 loaded.
Nov 22 09:09:05 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 22 09:09:05 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 22 09:09:05 localhost kernel: scsi host0: ata_piix
Nov 22 09:09:05 localhost systemd-udevd[479]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 09:09:05 localhost kernel: scsi host1: ata_piix
Nov 22 09:09:05 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 22 09:09:05 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 22 09:09:05 localhost kernel:  vda: vda1
Nov 22 09:09:05 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 22 09:09:05 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 22 09:09:05 localhost kernel: ata1: found unknown device (class 0)
Nov 22 09:09:05 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 22 09:09:05 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 22 09:09:05 localhost systemd[1]: Reached target System Initialization.
Nov 22 09:09:05 localhost systemd[1]: Reached target Basic System.
Nov 22 09:09:05 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 22 09:09:05 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 22 09:09:05 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 22 09:09:05 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 22 09:09:05 localhost systemd[1]: Found device /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 22 09:09:05 localhost systemd[1]: Reached target Initrd Root Device.
Nov 22 09:09:05 localhost systemd[1]: Finished dracut initqueue hook.
Nov 22 09:09:05 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 22 09:09:05 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 22 09:09:05 localhost systemd[1]: Reached target Remote File Systems.
Nov 22 09:09:05 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 22 09:09:05 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 22 09:09:05 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709...
Nov 22 09:09:05 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Nov 22 09:09:05 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 22 09:09:05 localhost systemd[1]: Mounting /sysroot...
Nov 22 09:09:06 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 22 09:09:06 localhost kernel: XFS (vda1): Mounting V5 Filesystem 47e3724e-7a1b-439a-9543-b98c9a290709
Nov 22 09:09:06 localhost kernel: XFS (vda1): Ending clean mount
Nov 22 09:09:06 localhost systemd[1]: Mounted /sysroot.
Nov 22 09:09:06 localhost systemd[1]: Reached target Initrd Root File System.
Nov 22 09:09:06 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 22 09:09:06 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 22 09:09:06 localhost systemd[1]: Reached target Initrd File Systems.
Nov 22 09:09:06 localhost systemd[1]: Reached target Initrd Default Target.
Nov 22 09:09:06 localhost systemd[1]: Starting dracut mount hook...
Nov 22 09:09:06 localhost systemd[1]: Finished dracut mount hook.
Nov 22 09:09:06 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 22 09:09:06 localhost rpc.idmapd[447]: exiting on signal 15
Nov 22 09:09:06 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 22 09:09:06 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 22 09:09:06 localhost systemd[1]: Stopped target Network.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Timer Units.
Nov 22 09:09:06 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 22 09:09:06 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Basic System.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Path Units.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Remote File Systems.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Slice Units.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Socket Units.
Nov 22 09:09:06 localhost systemd[1]: Stopped target System Initialization.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Local File Systems.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Swaps.
Nov 22 09:09:06 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped dracut mount hook.
Nov 22 09:09:06 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 22 09:09:06 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 22 09:09:06 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 22 09:09:06 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 22 09:09:06 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 22 09:09:06 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 22 09:09:06 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 22 09:09:06 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 22 09:09:06 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 22 09:09:06 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 22 09:09:06 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 22 09:09:06 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 22 09:09:06 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Closed udev Control Socket.
Nov 22 09:09:06 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Closed udev Kernel Socket.
Nov 22 09:09:06 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 22 09:09:06 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 22 09:09:06 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 22 09:09:07 localhost systemd[1]: Starting Cleanup udev Database...
Nov 22 09:09:07 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 22 09:09:07 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 22 09:09:07 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 22 09:09:07 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 22 09:09:07 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 22 09:09:07 localhost systemd[1]: Stopped Create System Users.
Nov 22 09:09:07 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 22 09:09:07 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 22 09:09:07 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 22 09:09:07 localhost systemd[1]: Finished Cleanup udev Database.
Nov 22 09:09:07 localhost systemd[1]: Reached target Switch Root.
Nov 22 09:09:07 localhost systemd[1]: Starting Switch Root...
Nov 22 09:09:07 localhost systemd[1]: Switching root.
Nov 22 09:09:07 localhost systemd-journald[306]: Journal stopped
Nov 22 09:09:08 localhost systemd-journald[306]: Received SIGTERM from PID 1 (systemd).
Nov 22 09:09:08 localhost kernel: audit: type=1404 audit(1763802547.426:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 22 09:09:08 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 09:09:08 localhost kernel: SELinux:  policy capability open_perms=1
Nov 22 09:09:08 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 09:09:08 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 22 09:09:08 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 09:09:08 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 09:09:08 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 09:09:08 localhost kernel: audit: type=1403 audit(1763802547.630:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 22 09:09:08 localhost systemd[1]: Successfully loaded SELinux policy in 209.387ms.
Nov 22 09:09:08 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.491ms.
Nov 22 09:09:08 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 22 09:09:08 localhost systemd[1]: Detected virtualization kvm.
Nov 22 09:09:08 localhost systemd[1]: Detected architecture x86-64.
Nov 22 09:09:08 localhost systemd-rc-local-generator[639]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:09:08 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 22 09:09:08 localhost systemd[1]: Stopped Switch Root.
Nov 22 09:09:08 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 22 09:09:08 localhost systemd[1]: Created slice Slice /system/getty.
Nov 22 09:09:08 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 22 09:09:08 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 22 09:09:08 localhost systemd[1]: Created slice User and Session Slice.
Nov 22 09:09:08 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 22 09:09:08 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 22 09:09:08 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 22 09:09:08 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 22 09:09:08 localhost systemd[1]: Stopped target Switch Root.
Nov 22 09:09:08 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 22 09:09:08 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 22 09:09:08 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 22 09:09:08 localhost systemd[1]: Reached target Path Units.
Nov 22 09:09:08 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 22 09:09:08 localhost systemd[1]: Reached target Slice Units.
Nov 22 09:09:08 localhost systemd[1]: Reached target Swaps.
Nov 22 09:09:08 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 22 09:09:08 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 22 09:09:08 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 22 09:09:08 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 22 09:09:08 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 22 09:09:08 localhost systemd[1]: Listening on udev Control Socket.
Nov 22 09:09:08 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 22 09:09:08 localhost systemd[1]: Mounting Huge Pages File System...
Nov 22 09:09:08 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 22 09:09:08 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 22 09:09:08 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 22 09:09:08 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 22 09:09:08 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 22 09:09:08 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 22 09:09:08 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 22 09:09:08 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Nov 22 09:09:08 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 22 09:09:08 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 22 09:09:08 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 22 09:09:08 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 22 09:09:08 localhost systemd[1]: Stopped Journal Service.
Nov 22 09:09:08 localhost systemd[1]: Starting Journal Service...
Nov 22 09:09:08 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 22 09:09:08 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 22 09:09:08 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 09:09:08 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 22 09:09:08 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 22 09:09:08 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 22 09:09:08 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 22 09:09:08 localhost kernel: fuse: init (API version 7.37)
Nov 22 09:09:08 localhost systemd[1]: Mounted Huge Pages File System.
Nov 22 09:09:08 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 22 09:09:08 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 22 09:09:08 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 22 09:09:08 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 22 09:09:08 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 09:09:08 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 22 09:09:08 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 22 09:09:08 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 22 09:09:08 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 22 09:09:08 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 22 09:09:08 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 22 09:09:08 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 22 09:09:08 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 22 09:09:08 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 22 09:09:08 localhost kernel: ACPI: bus type drm_connector registered
Nov 22 09:09:08 localhost systemd-journald[680]: Journal started
Nov 22 09:09:08 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 22 09:09:08 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 22 09:09:08 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 22 09:09:08 localhost systemd[1]: Mounting FUSE Control File System...
Nov 22 09:09:08 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 22 09:09:09 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 22 09:09:09 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 22 09:09:09 localhost systemd[1]: Starting Load/Save OS Random Seed...
Nov 22 09:09:09 localhost systemd[1]: Starting Create System Users...
Nov 22 09:09:09 localhost systemd[1]: Started Journal Service.
Nov 22 09:09:09 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 22 09:09:09 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 22 09:09:09 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 22 09:09:09 localhost systemd[1]: Mounted FUSE Control File System.
Nov 22 09:09:09 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 22 09:09:09 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 22 09:09:09 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 22 09:09:09 localhost systemd-journald[680]: Received client request to flush runtime journal.
Nov 22 09:09:09 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 22 09:09:09 localhost systemd[1]: Finished Create System Users.
Nov 22 09:09:09 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 22 09:09:09 localhost systemd[1]: Finished Load/Save OS Random Seed.
Nov 22 09:09:09 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 22 09:09:09 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 22 09:09:09 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 22 09:09:09 localhost systemd[1]: Reached target Local File Systems.
Nov 22 09:09:09 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 22 09:09:09 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 22 09:09:09 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 22 09:09:09 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 22 09:09:09 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 22 09:09:09 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 22 09:09:09 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 22 09:09:09 localhost bootctl[697]: Couldn't find EFI system partition, skipping.
Nov 22 09:09:09 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 22 09:09:10 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 22 09:09:10 localhost systemd[1]: Starting Security Auditing Service...
Nov 22 09:09:10 localhost systemd[1]: Starting RPC Bind...
Nov 22 09:09:10 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 22 09:09:10 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 22 09:09:10 localhost auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 22 09:09:10 localhost auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 22 09:09:10 localhost systemd[1]: Started RPC Bind.
Nov 22 09:09:10 localhost augenrules[708]: /sbin/augenrules: No change
Nov 22 09:09:10 localhost augenrules[724]: No rules
Nov 22 09:09:10 localhost augenrules[724]: enabled 1
Nov 22 09:09:10 localhost augenrules[724]: failure 1
Nov 22 09:09:10 localhost augenrules[724]: pid 703
Nov 22 09:09:10 localhost augenrules[724]: rate_limit 0
Nov 22 09:09:10 localhost augenrules[724]: backlog_limit 8192
Nov 22 09:09:10 localhost augenrules[724]: lost 0
Nov 22 09:09:10 localhost augenrules[724]: backlog 0
Nov 22 09:09:10 localhost augenrules[724]: backlog_wait_time 60000
Nov 22 09:09:10 localhost augenrules[724]: backlog_wait_time_actual 0
Nov 22 09:09:10 localhost augenrules[724]: enabled 1
Nov 22 09:09:10 localhost augenrules[724]: failure 1
Nov 22 09:09:10 localhost augenrules[724]: pid 703
Nov 22 09:09:10 localhost augenrules[724]: rate_limit 0
Nov 22 09:09:10 localhost augenrules[724]: backlog_limit 8192
Nov 22 09:09:10 localhost augenrules[724]: lost 0
Nov 22 09:09:10 localhost augenrules[724]: backlog 4
Nov 22 09:09:10 localhost augenrules[724]: backlog_wait_time 60000
Nov 22 09:09:10 localhost augenrules[724]: backlog_wait_time_actual 0
Nov 22 09:09:10 localhost augenrules[724]: enabled 1
Nov 22 09:09:10 localhost augenrules[724]: failure 1
Nov 22 09:09:10 localhost augenrules[724]: pid 703
Nov 22 09:09:10 localhost augenrules[724]: rate_limit 0
Nov 22 09:09:10 localhost augenrules[724]: backlog_limit 8192
Nov 22 09:09:10 localhost augenrules[724]: lost 0
Nov 22 09:09:10 localhost augenrules[724]: backlog 3
Nov 22 09:09:10 localhost augenrules[724]: backlog_wait_time 60000
Nov 22 09:09:10 localhost augenrules[724]: backlog_wait_time_actual 0
Nov 22 09:09:10 localhost systemd[1]: Started Security Auditing Service.
Nov 22 09:09:10 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 22 09:09:10 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 22 09:09:10 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 22 09:09:10 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 22 09:09:10 localhost systemd-udevd[732]: Using default interface naming scheme 'rhel-9.0'.
Nov 22 09:09:11 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 22 09:09:11 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 22 09:09:11 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 09:09:11 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 22 09:09:11 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 22 09:09:11 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 22 09:09:11 localhost systemd-udevd[747]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 09:09:11 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 22 09:09:11 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 22 09:09:11 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 22 09:09:11 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 22 09:09:11 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 22 09:09:11 localhost kernel: Console: switching to colour dummy device 80x25
Nov 22 09:09:11 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 22 09:09:11 localhost kernel: [drm] features: -context_init
Nov 22 09:09:11 localhost kernel: [drm] number of scanouts: 1
Nov 22 09:09:11 localhost kernel: [drm] number of cap sets: 0
Nov 22 09:09:11 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 22 09:09:11 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 22 09:09:11 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 22 09:09:11 localhost kernel: kvm_amd: TSC scaling supported
Nov 22 09:09:11 localhost kernel: kvm_amd: Nested Virtualization enabled
Nov 22 09:09:11 localhost kernel: kvm_amd: Nested Paging enabled
Nov 22 09:09:11 localhost kernel: kvm_amd: LBR virtualization supported
Nov 22 09:09:11 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 22 09:09:11 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 22 09:09:11 localhost systemd[1]: Starting Update is Completed...
Nov 22 09:09:11 localhost systemd[1]: Finished Update is Completed.
Nov 22 09:09:11 localhost systemd[1]: Reached target System Initialization.
Nov 22 09:09:11 localhost systemd[1]: Started dnf makecache --timer.
Nov 22 09:09:11 localhost systemd[1]: Started Daily rotation of log files.
Nov 22 09:09:11 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 22 09:09:11 localhost systemd[1]: Reached target Timer Units.
Nov 22 09:09:11 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 22 09:09:11 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 22 09:09:11 localhost systemd[1]: Reached target Socket Units.
Nov 22 09:09:11 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 22 09:09:11 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 09:09:12 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 22 09:09:12 localhost systemd[1]: Reached target Basic System.
Nov 22 09:09:12 localhost dbus-broker-lau[812]: Ready
Nov 22 09:09:12 localhost systemd[1]: Starting NTP client/server...
Nov 22 09:09:12 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 22 09:09:12 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 22 09:09:12 localhost systemd[1]: Starting IPv4 firewall with iptables...
Nov 22 09:09:12 localhost systemd[1]: Started irqbalance daemon.
Nov 22 09:09:12 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 22 09:09:12 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 09:09:12 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 09:09:12 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 09:09:12 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 22 09:09:12 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 22 09:09:12 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 22 09:09:12 localhost systemd[1]: Starting User Login Management...
Nov 22 09:09:12 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 22 09:09:12 localhost systemd-logind[819]: New seat seat0.
Nov 22 09:09:12 localhost systemd-logind[819]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 22 09:09:12 localhost systemd-logind[819]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 22 09:09:12 localhost systemd[1]: Started User Login Management.
Nov 22 09:09:12 localhost chronyd[827]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 22 09:09:12 localhost chronyd[827]: Loaded 0 symmetric keys
Nov 22 09:09:12 localhost chronyd[827]: Using right/UTC timezone to obtain leap second data
Nov 22 09:09:12 localhost chronyd[827]: Loaded seccomp filter (level 2)
Nov 22 09:09:12 localhost systemd[1]: Started NTP client/server.
Nov 22 09:09:12 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 22 09:09:12 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 22 09:09:12 localhost iptables.init[817]: iptables: Applying firewall rules: [  OK  ]
Nov 22 09:09:12 localhost systemd[1]: Finished IPv4 firewall with iptables.
Nov 22 09:09:14 localhost cloud-init[840]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 22 Nov 2025 09:09:14 +0000. Up 11.86 seconds.
Nov 22 09:09:14 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 22 09:09:14 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 22 09:09:14 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp8dbg3u1u.mount: Deactivated successfully.
Nov 22 09:09:14 localhost systemd[1]: Starting Hostname Service...
Nov 22 09:09:14 localhost systemd[1]: Started Hostname Service.
Nov 22 09:09:14 np0005532133.novalocal systemd-hostnamed[855]: Hostname set to <np0005532133.novalocal> (static)
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Reached target Preparation for Network.
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Starting Network Manager...
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.2638] NetworkManager (version 1.54.1-1.el9) is starting... (boot:3931c0a4-baf1-4f89-bc19-8b6e9c477257)
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.2643] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.2830] manager[0x559dcfbc0080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.2920] hostname: hostname: using hostnamed
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.2921] hostname: static hostname changed from (none) to "np0005532133.novalocal"
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.2926] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3036] manager[0x559dcfbc0080]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3038] manager[0x559dcfbc0080]: rfkill: WWAN hardware radio set enabled
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3336] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3337] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3337] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3338] manager: Networking is enabled by state file
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3340] settings: Loaded settings plugin: keyfile (internal)
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3395] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3573] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3728] dhcp: init: Using DHCP client 'internal'
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3731] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3745] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3757] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3763] device (lo): Activation: starting connection 'lo' (6f9484ed-c385-45a5-b64f-d6d1d6763032)
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3772] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3774] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.3804] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Started Network Manager.
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4043] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4046] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4048] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4049] device (eth0): carrier: link connected
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4052] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4056] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4061] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4065] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4066] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4067] manager: NetworkManager state is now CONNECTING
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4068] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4090] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4093] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Reached target Network.
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4161] dhcp4 (eth0): state changed new lease, address=38.129.56.220
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4170] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4190] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4449] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4452] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4461] device (lo): Activation: successful, device activated.
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4472] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4476] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4482] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4487] device (eth0): Activation: successful, device activated.
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4493] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 09:09:15 np0005532133.novalocal NetworkManager[859]: <info>  [1763802555.4498] manager: startup complete
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Reached target NFS client services.
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Reached target Remote File Systems.
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 22 09:09:15 np0005532133.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 22 Nov 2025 09:09:15 +0000. Up 13.55 seconds.
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: |  eth0  | True |        38.129.56.220         | 255.255.255.0 | global | fa:16:3e:0a:c7:8a |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: |  eth0  | True | fe80::f816:3eff:fe0a:c78a/64 |       .       |  link  | fa:16:3e:0a:c7:8a |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 22 09:09:15 np0005532133.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 09:09:20 np0005532133.novalocal chronyd[827]: Selected source 206.108.0.132 (2.centos.pool.ntp.org)
Nov 22 09:09:20 np0005532133.novalocal chronyd[827]: System clock TAI offset set to 37 seconds
Nov 22 09:09:20 np0005532133.novalocal useradd[987]: new group: name=cloud-user, GID=1001
Nov 22 09:09:20 np0005532133.novalocal useradd[987]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 22 09:09:20 np0005532133.novalocal useradd[987]: add 'cloud-user' to group 'adm'
Nov 22 09:09:20 np0005532133.novalocal useradd[987]: add 'cloud-user' to group 'systemd-journal'
Nov 22 09:09:20 np0005532133.novalocal useradd[987]: add 'cloud-user' to shadow group 'adm'
Nov 22 09:09:20 np0005532133.novalocal useradd[987]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: Cannot change IRQ 35 affinity: Operation not permitted
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: IRQ 35 affinity is now unmanaged
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: Cannot change IRQ 33 affinity: Operation not permitted
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: IRQ 33 affinity is now unmanaged
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: IRQ 31 affinity is now unmanaged
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: IRQ 28 affinity is now unmanaged
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: Cannot change IRQ 34 affinity: Operation not permitted
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: IRQ 34 affinity is now unmanaged
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: IRQ 32 affinity is now unmanaged
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: IRQ 30 affinity is now unmanaged
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 22 09:09:22 np0005532133.novalocal irqbalance[818]: IRQ 29 affinity is now unmanaged
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: Generating public/private rsa key pair.
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: The key fingerprint is:
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: SHA256:441vIJNEf8y1G8bGSmq47E3MbeEXZXjX78h0WaSXVCA root@np0005532133.novalocal
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: The key's randomart image is:
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: +---[RSA 3072]----+
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |            E .o+|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |      .     .o.oo|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |     . . o +..=.=|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |      . . = B+ o+|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |     . oS+.+.o..o|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |      =++*...+ + |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |     . =*.= . o .|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |      oo o..     |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |     .. ...      |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: +----[SHA256]-----+
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: Generating public/private ecdsa key pair.
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: The key fingerprint is:
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: SHA256:qlLoG6ipqJi+auJlhUnVnQDdAsXNNQiZm9rZgLnEcsE root@np0005532133.novalocal
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: The key's randomart image is:
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: +---[ECDSA 256]---+
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |    .+B+X +o     |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |    .E B B  .    |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |   .. + +        |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |  ..o* +         |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |   ++.+ S        |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: | .. oo + .       |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |...+  .          |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |=+=. .           |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |^+oo.            |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: +----[SHA256]-----+
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: Generating public/private ed25519 key pair.
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: The key fingerprint is:
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: SHA256:Twa1Dm65eTRe1AvR8A87C6ZcqWWxrq1l3DtPwC2nVOU root@np0005532133.novalocal
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: The key's randomart image is:
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: +--[ED25519 256]--+
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |          . oo  .|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |         . . +...|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |        o . + + E|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |       . = . * B |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |        S * X O +|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |       . X % + B |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |        o B = + .|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |         . =  .o |
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: |          o.. .o.|
Nov 22 09:09:22 np0005532133.novalocal cloud-init[921]: +----[SHA256]-----+
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Reached target Network is Online.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Starting System Logging Service...
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Starting Permit User Sessions...
Nov 22 09:09:22 np0005532133.novalocal sm-notify[1003]: Version 2.5.4 starting
Nov 22 09:09:22 np0005532133.novalocal sshd[1005]: Server listening on 0.0.0.0 port 22.
Nov 22 09:09:22 np0005532133.novalocal sshd[1005]: Server listening on :: port 22.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Finished Permit User Sessions.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Started Command Scheduler.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Started Getty on tty1.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 22 09:09:22 np0005532133.novalocal crond[1008]: (CRON) STARTUP (1.5.7)
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Reached target Login Prompts.
Nov 22 09:09:22 np0005532133.novalocal crond[1008]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 22 09:09:22 np0005532133.novalocal crond[1008]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 40% if used.)
Nov 22 09:09:22 np0005532133.novalocal crond[1008]: (CRON) INFO (running with inotify support)
Nov 22 09:09:22 np0005532133.novalocal rsyslogd[1004]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1004" x-info="https://www.rsyslog.com"] start
Nov 22 09:09:22 np0005532133.novalocal rsyslogd[1004]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Started System Logging Service.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Reached target Multi-User System.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 22 09:09:22 np0005532133.novalocal rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 09:09:22 np0005532133.novalocal cloud-init[1064]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 22 Nov 2025 09:09:22 +0000. Up 20.51 seconds.
Nov 22 09:09:22 np0005532133.novalocal kdumpctl[1013]: kdump: No kdump initial ramdisk found.
Nov 22 09:09:22 np0005532133.novalocal kdumpctl[1013]: kdump: Rebuilding /boot/initramfs-5.14.0-639.el9.x86_64kdump.img
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Nov 22 09:09:22 np0005532133.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Nov 22 09:09:23 np0005532133.novalocal cloud-init[1206]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 22 Nov 2025 09:09:23 +0000. Up 20.90 seconds.
Nov 22 09:09:23 np0005532133.novalocal cloud-init[1239]: #############################################################
Nov 22 09:09:23 np0005532133.novalocal cloud-init[1243]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 22 09:09:23 np0005532133.novalocal cloud-init[1249]: 256 SHA256:qlLoG6ipqJi+auJlhUnVnQDdAsXNNQiZm9rZgLnEcsE root@np0005532133.novalocal (ECDSA)
Nov 22 09:09:23 np0005532133.novalocal cloud-init[1256]: 256 SHA256:Twa1Dm65eTRe1AvR8A87C6ZcqWWxrq1l3DtPwC2nVOU root@np0005532133.novalocal (ED25519)
Nov 22 09:09:23 np0005532133.novalocal cloud-init[1259]: 3072 SHA256:441vIJNEf8y1G8bGSmq47E3MbeEXZXjX78h0WaSXVCA root@np0005532133.novalocal (RSA)
Nov 22 09:09:23 np0005532133.novalocal cloud-init[1260]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 22 09:09:23 np0005532133.novalocal cloud-init[1261]: #############################################################
Nov 22 09:09:23 np0005532133.novalocal cloud-init[1206]: Cloud-init v. 24.4-7.el9 finished at Sat, 22 Nov 2025 09:09:23 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 21.16 seconds
Nov 22 09:09:23 np0005532133.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Nov 22 09:09:23 np0005532133.novalocal systemd[1]: Reached target Cloud-init target.
Nov 22 09:09:23 np0005532133.novalocal sshd-session[1273]: Connection reset by 38.102.83.114 port 52812 [preauth]
Nov 22 09:09:23 np0005532133.novalocal dracut[1284]: dracut-057-102.git20250818.el9
Nov 22 09:09:23 np0005532133.novalocal sshd-session[1285]: Unable to negotiate with 38.102.83.114 port 52822: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 22 09:09:23 np0005532133.novalocal sshd-session[1292]: Unable to negotiate with 38.102.83.114 port 52832: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 22 09:09:23 np0005532133.novalocal sshd-session[1294]: Unable to negotiate with 38.102.83.114 port 52846: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 22 09:09:23 np0005532133.novalocal sshd-session[1290]: Connection closed by 38.102.83.114 port 52828 [preauth]
Nov 22 09:09:23 np0005532133.novalocal sshd-session[1310]: Connection reset by 38.102.83.114 port 52868 [preauth]
Nov 22 09:09:23 np0005532133.novalocal sshd-session[1312]: Unable to negotiate with 38.102.83.114 port 52880: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Nov 22 09:09:23 np0005532133.novalocal sshd-session[1314]: Unable to negotiate with 38.102.83.114 port 52882: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 22 09:09:23 np0005532133.novalocal sshd-session[1297]: Connection closed by 38.102.83.114 port 52854 [preauth]
Nov 22 09:09:23 np0005532133.novalocal dracut[1287]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-639.el9.x86_64kdump.img 5.14.0-639.el9.x86_64
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 22 09:09:24 np0005532133.novalocal dracut[1287]: Module 'resume' will not be installed, because it's in the list to be omitted!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: memstrack is not available
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: memstrack is not available
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: *** Including module: systemd ***
Nov 22 09:09:25 np0005532133.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 09:09:25 np0005532133.novalocal dracut[1287]: *** Including module: fips ***
Nov 22 09:09:26 np0005532133.novalocal dracut[1287]: *** Including module: systemd-initrd ***
Nov 22 09:09:26 np0005532133.novalocal dracut[1287]: *** Including module: i18n ***
Nov 22 09:09:26 np0005532133.novalocal dracut[1287]: *** Including module: drm ***
Nov 22 09:09:26 np0005532133.novalocal dracut[1287]: *** Including module: prefixdevname ***
Nov 22 09:09:26 np0005532133.novalocal dracut[1287]: *** Including module: kernel-modules ***
Nov 22 09:09:26 np0005532133.novalocal kernel: block vda: the capability attribute has been deprecated.
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: *** Including module: kernel-modules-extra ***
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: *** Including module: qemu ***
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: *** Including module: fstab-sys ***
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: *** Including module: rootfs-block ***
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: *** Including module: terminfo ***
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: *** Including module: udev-rules ***
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: Skipping udev rule: 91-permissions.rules
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: *** Including module: virtiofs ***
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: *** Including module: dracut-systemd ***
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: *** Including module: usrmount ***
Nov 22 09:09:27 np0005532133.novalocal dracut[1287]: *** Including module: base ***
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]: *** Including module: fs-lib ***
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]: *** Including module: kdumpbase ***
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:   microcode_ctl module: mangling fw_dir
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: configuration "intel" is ignored
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]: *** Including module: openssl ***
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]: *** Including module: shutdown ***
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]: *** Including module: squash ***
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]: *** Including modules done ***
Nov 22 09:09:28 np0005532133.novalocal dracut[1287]: *** Installing kernel module dependencies ***
Nov 22 09:09:29 np0005532133.novalocal dracut[1287]: *** Installing kernel module dependencies done ***
Nov 22 09:09:29 np0005532133.novalocal dracut[1287]: *** Resolving executable dependencies ***
Nov 22 09:09:31 np0005532133.novalocal dracut[1287]: *** Resolving executable dependencies done ***
Nov 22 09:09:31 np0005532133.novalocal dracut[1287]: *** Generating early-microcode cpio image ***
Nov 22 09:09:31 np0005532133.novalocal dracut[1287]: *** Store current command line parameters ***
Nov 22 09:09:31 np0005532133.novalocal dracut[1287]: Stored kernel commandline:
Nov 22 09:09:31 np0005532133.novalocal dracut[1287]: No dracut internal kernel commandline stored in the initramfs
Nov 22 09:09:31 np0005532133.novalocal dracut[1287]: *** Install squash loader ***
Nov 22 09:09:32 np0005532133.novalocal dracut[1287]: *** Squashing the files inside the initramfs ***
Nov 22 09:09:33 np0005532133.novalocal dracut[1287]: *** Squashing the files inside the initramfs done ***
Nov 22 09:09:33 np0005532133.novalocal dracut[1287]: *** Creating image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' ***
Nov 22 09:09:33 np0005532133.novalocal dracut[1287]: *** Hardlinking files ***
Nov 22 09:09:33 np0005532133.novalocal dracut[1287]: Mode:           real
Nov 22 09:09:33 np0005532133.novalocal dracut[1287]: Files:          50
Nov 22 09:09:33 np0005532133.novalocal dracut[1287]: Linked:         0 files
Nov 22 09:09:33 np0005532133.novalocal dracut[1287]: Compared:       0 xattrs
Nov 22 09:09:33 np0005532133.novalocal dracut[1287]: Compared:       0 files
Nov 22 09:09:33 np0005532133.novalocal dracut[1287]: Saved:          0 B
Nov 22 09:09:33 np0005532133.novalocal dracut[1287]: Duration:       0.000488 seconds
Nov 22 09:09:33 np0005532133.novalocal dracut[1287]: *** Hardlinking files done ***
Nov 22 09:09:34 np0005532133.novalocal dracut[1287]: *** Creating initramfs image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' done ***
Nov 22 09:09:34 np0005532133.novalocal kdumpctl[1013]: kdump: kexec: loaded kdump kernel
Nov 22 09:09:34 np0005532133.novalocal kdumpctl[1013]: kdump: Starting kdump: [OK]
Nov 22 09:09:34 np0005532133.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 22 09:09:34 np0005532133.novalocal systemd[1]: Startup finished in 1.639s (kernel) + 3.545s (initrd) + 27.373s (userspace) = 32.559s.
Nov 22 09:09:45 np0005532133.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 09:09:45 np0005532133.novalocal sshd-session[4295]: Accepted publickey for zuul from 38.102.83.114 port 60772 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 22 09:09:45 np0005532133.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 22 09:09:45 np0005532133.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 22 09:09:45 np0005532133.novalocal systemd-logind[819]: New session 1 of user zuul.
Nov 22 09:09:45 np0005532133.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 22 09:09:45 np0005532133.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Queued start job for default target Main User Target.
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Created slice User Application Slice.
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Reached target Paths.
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Reached target Timers.
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Starting D-Bus User Message Bus Socket...
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Starting Create User's Volatile Files and Directories...
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Listening on D-Bus User Message Bus Socket.
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Reached target Sockets.
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Finished Create User's Volatile Files and Directories.
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Reached target Basic System.
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Reached target Main User Target.
Nov 22 09:09:45 np0005532133.novalocal systemd[4301]: Startup finished in 122ms.
Nov 22 09:09:45 np0005532133.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 22 09:09:45 np0005532133.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 22 09:09:45 np0005532133.novalocal sshd-session[4295]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:09:46 np0005532133.novalocal python3[4383]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:09:48 np0005532133.novalocal python3[4411]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:09:54 np0005532133.novalocal python3[4469]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:09:55 np0005532133.novalocal python3[4509]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 22 09:09:57 np0005532133.novalocal python3[4535]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDBiyRPfbgMvOokrOcWkVvvnErIQEMtTkf2WpCXS3XNCIUBViVQkL0TSG5NObTk+zhKDFWpAOHtde8lmLaoLncvCTtiqNMoRUHldY19Hv1kzgbltFa67/RhL5g9WD+TLBdqiYe2UcVHC7zp9X9E1AiR7LvVa5kvZaWWJN/YdeZnhJUPesxnK2WLf+juX+KOMuc/Q5HMcnE48DdNTrzeWoVzg86hzlqThUvO1C2eLlIhvHA3BtgATLtMDVHbMU+KOKE+9/SMqU2v3JS2m/lP1QL/2vSUF6CWxHVaN/7gWGEg1IqYVLZG0HEy25R3cTTVSLEOT1Oueokf8DFMlf5PeFeD3tA5gRRDuS1ZoWfpH71q6eSi/WjA8pgUxRgBH9LaGy7CTiM/dVQ79MbYp3y7p4AqUjq0OpAWGIZe/souLGzHkNXZRHQM5inf9YKrR0j9Kak6WEpsJZwdZMdEMP0J3RsqXici75PlaaOAeTVDVy7kF5sLWJYYBg7jR/dzczLdwd0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:09:57 np0005532133.novalocal python3[4559]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:09:58 np0005532133.novalocal python3[4658]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:09:58 np0005532133.novalocal python3[4729]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763802597.7267778-207-21227659233062/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=2f68832295d840e2956409f0c16faf98_id_rsa follow=False checksum=43f57567b0cba8bf6655360ec35e98b728bb42ff backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:09:58 np0005532133.novalocal python3[4852]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:09:59 np0005532133.novalocal python3[4923]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763802598.5991755-240-211957862303586/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=2f68832295d840e2956409f0c16faf98_id_rsa.pub follow=False checksum=422b3d9e02ea28f76cf53902c834e646800005be backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:00 np0005532133.novalocal python3[4971]: ansible-ping Invoked with data=pong
Nov 22 09:10:01 np0005532133.novalocal python3[4995]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:10:03 np0005532133.novalocal python3[5053]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 22 09:10:04 np0005532133.novalocal python3[5085]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:04 np0005532133.novalocal python3[5109]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:04 np0005532133.novalocal python3[5133]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:04 np0005532133.novalocal python3[5157]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:05 np0005532133.novalocal python3[5181]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:05 np0005532133.novalocal python3[5205]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:06 np0005532133.novalocal sudo[5229]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lopjllhfkqteivvulnglvyzayvoeqern ; /usr/bin/python3'
Nov 22 09:10:06 np0005532133.novalocal sudo[5229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:10:06 np0005532133.novalocal python3[5231]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:06 np0005532133.novalocal sudo[5229]: pam_unix(sudo:session): session closed for user root
Nov 22 09:10:07 np0005532133.novalocal sudo[5307]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkffimeivdlzdmienhsdvpocjeaifglw ; /usr/bin/python3'
Nov 22 09:10:07 np0005532133.novalocal sudo[5307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:10:07 np0005532133.novalocal python3[5309]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:10:07 np0005532133.novalocal sudo[5307]: pam_unix(sudo:session): session closed for user root
Nov 22 09:10:07 np0005532133.novalocal sudo[5380]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqkilytxdmzpgtoermutccdbyxiidhcv ; /usr/bin/python3'
Nov 22 09:10:07 np0005532133.novalocal sudo[5380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:10:07 np0005532133.novalocal python3[5382]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763802606.9942572-21-12576541176347/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:07 np0005532133.novalocal sudo[5380]: pam_unix(sudo:session): session closed for user root
Nov 22 09:10:08 np0005532133.novalocal python3[5430]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:08 np0005532133.novalocal python3[5454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:09 np0005532133.novalocal python3[5478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:09 np0005532133.novalocal python3[5502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:09 np0005532133.novalocal python3[5526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:09 np0005532133.novalocal python3[5550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:10 np0005532133.novalocal python3[5574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:10 np0005532133.novalocal python3[5598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:10 np0005532133.novalocal python3[5622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:10 np0005532133.novalocal python3[5646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:11 np0005532133.novalocal python3[5670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:11 np0005532133.novalocal python3[5694]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:11 np0005532133.novalocal python3[5718]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:12 np0005532133.novalocal python3[5742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:12 np0005532133.novalocal python3[5766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:12 np0005532133.novalocal python3[5790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:12 np0005532133.novalocal python3[5814]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:13 np0005532133.novalocal python3[5838]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:13 np0005532133.novalocal python3[5862]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:13 np0005532133.novalocal python3[5886]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:13 np0005532133.novalocal python3[5910]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:14 np0005532133.novalocal python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:14 np0005532133.novalocal python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:14 np0005532133.novalocal python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:15 np0005532133.novalocal python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:15 np0005532133.novalocal python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:10:18 np0005532133.novalocal sudo[6054]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrnqecswalhoilmkafutqbordlegqarl ; /usr/bin/python3'
Nov 22 09:10:18 np0005532133.novalocal sudo[6054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:10:18 np0005532133.novalocal python3[6056]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 22 09:10:18 np0005532133.novalocal systemd[1]: Starting Time & Date Service...
Nov 22 09:10:18 np0005532133.novalocal systemd[1]: Started Time & Date Service.
Nov 22 09:10:18 np0005532133.novalocal systemd-timedated[6058]: Changed time zone to 'UTC' (UTC).
Nov 22 09:10:18 np0005532133.novalocal sudo[6054]: pam_unix(sudo:session): session closed for user root
Nov 22 09:10:18 np0005532133.novalocal sudo[6085]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqnnaspmemukrllfjaydvtisgijvkmbc ; /usr/bin/python3'
Nov 22 09:10:18 np0005532133.novalocal sudo[6085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:10:18 np0005532133.novalocal python3[6087]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:18 np0005532133.novalocal sudo[6085]: pam_unix(sudo:session): session closed for user root
Nov 22 09:10:19 np0005532133.novalocal python3[6163]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:10:19 np0005532133.novalocal python3[6234]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1763802618.871582-153-216014872632163/source _original_basename=tmpd9kmkeef follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:19 np0005532133.novalocal python3[6334]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:10:20 np0005532133.novalocal python3[6405]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763802619.7538729-183-209041473646622/source _original_basename=tmpbjmf_6w1 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:20 np0005532133.novalocal sudo[6505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bumuvdaaqzyurngcuyikmbqiolphomlk ; /usr/bin/python3'
Nov 22 09:10:20 np0005532133.novalocal sudo[6505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:10:21 np0005532133.novalocal python3[6507]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:10:21 np0005532133.novalocal sudo[6505]: pam_unix(sudo:session): session closed for user root
Nov 22 09:10:21 np0005532133.novalocal sudo[6578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbndyzwoclhkbfmulidczsaidevvxtgj ; /usr/bin/python3'
Nov 22 09:10:21 np0005532133.novalocal sudo[6578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:10:21 np0005532133.novalocal python3[6580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763802620.8175669-231-197938163584806/source _original_basename=tmpuhk5fjay follow=False checksum=1276cb392a01956406f90ecf598575f86294c8dd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:21 np0005532133.novalocal sudo[6578]: pam_unix(sudo:session): session closed for user root
Nov 22 09:10:22 np0005532133.novalocal python3[6628]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:10:22 np0005532133.novalocal python3[6654]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:10:22 np0005532133.novalocal sudo[6732]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hplrcryptcmqagliioktzusmdnqbvudg ; /usr/bin/python3'
Nov 22 09:10:22 np0005532133.novalocal sudo[6732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:10:22 np0005532133.novalocal python3[6734]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:10:22 np0005532133.novalocal sudo[6732]: pam_unix(sudo:session): session closed for user root
Nov 22 09:10:22 np0005532133.novalocal sudo[6805]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzajlzabmbelxwijzmwueomsfummjshj ; /usr/bin/python3'
Nov 22 09:10:22 np0005532133.novalocal sudo[6805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:10:23 np0005532133.novalocal python3[6807]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1763802622.3988812-273-128320182105487/source _original_basename=tmpywqw_8_s follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:23 np0005532133.novalocal sudo[6805]: pam_unix(sudo:session): session closed for user root
Nov 22 09:10:23 np0005532133.novalocal sudo[6856]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkuuesdldmarzlxcczuhkkdczbcuhmlc ; /usr/bin/python3'
Nov 22 09:10:23 np0005532133.novalocal sudo[6856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:10:23 np0005532133.novalocal python3[6858]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-3d13-6167-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:10:23 np0005532133.novalocal sudo[6856]: pam_unix(sudo:session): session closed for user root
Nov 22 09:10:24 np0005532133.novalocal python3[6886]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-3d13-6167-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 22 09:10:25 np0005532133.novalocal python3[6914]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:47 np0005532133.novalocal sudo[6938]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulgoesglwpbcooqppbjjvlptwusivibj ; /usr/bin/python3'
Nov 22 09:10:47 np0005532133.novalocal sudo[6938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:10:47 np0005532133.novalocal python3[6940]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:10:47 np0005532133.novalocal sudo[6938]: pam_unix(sudo:session): session closed for user root
Nov 22 09:10:48 np0005532133.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 09:11:23 np0005532133.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 22 09:11:23 np0005532133.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 22 09:11:23 np0005532133.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 22 09:11:23 np0005532133.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 22 09:11:23 np0005532133.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 22 09:11:23 np0005532133.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 22 09:11:23 np0005532133.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 22 09:11:23 np0005532133.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 22 09:11:23 np0005532133.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 22 09:11:23 np0005532133.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 22 09:11:23 np0005532133.novalocal NetworkManager[859]: <info>  [1763802683.8126] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 09:11:23 np0005532133.novalocal systemd-udevd[6943]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 09:11:23 np0005532133.novalocal NetworkManager[859]: <info>  [1763802683.8295] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:11:23 np0005532133.novalocal NetworkManager[859]: <info>  [1763802683.8326] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 22 09:11:23 np0005532133.novalocal NetworkManager[859]: <info>  [1763802683.8331] device (eth1): carrier: link connected
Nov 22 09:11:23 np0005532133.novalocal NetworkManager[859]: <info>  [1763802683.8333] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 22 09:11:23 np0005532133.novalocal NetworkManager[859]: <info>  [1763802683.8340] policy: auto-activating connection 'Wired connection 1' (2d6579af-6f98-3948-b6ff-43aec0679b8f)
Nov 22 09:11:23 np0005532133.novalocal NetworkManager[859]: <info>  [1763802683.8343] device (eth1): Activation: starting connection 'Wired connection 1' (2d6579af-6f98-3948-b6ff-43aec0679b8f)
Nov 22 09:11:23 np0005532133.novalocal NetworkManager[859]: <info>  [1763802683.8344] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:11:23 np0005532133.novalocal NetworkManager[859]: <info>  [1763802683.8348] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:11:23 np0005532133.novalocal NetworkManager[859]: <info>  [1763802683.8353] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:11:23 np0005532133.novalocal NetworkManager[859]: <info>  [1763802683.8357] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 09:11:24 np0005532133.novalocal python3[6970]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-3c3e-feba-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:11:31 np0005532133.novalocal sudo[7048]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytqcqvcxybkujiyiegbotdprkfmjsqnw ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 22 09:11:31 np0005532133.novalocal sudo[7048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:11:31 np0005532133.novalocal python3[7050]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:11:31 np0005532133.novalocal sudo[7048]: pam_unix(sudo:session): session closed for user root
Nov 22 09:11:31 np0005532133.novalocal sudo[7121]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isjuvtwvysstzpdozbqpykmvqbbygary ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 22 09:11:31 np0005532133.novalocal sudo[7121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:11:32 np0005532133.novalocal python3[7123]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763802691.4795272-102-267124436611824/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=55bbd5ee7c7cf463039f4fa9714dd5c56cd036c4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:11:32 np0005532133.novalocal sudo[7121]: pam_unix(sudo:session): session closed for user root
Nov 22 09:11:32 np0005532133.novalocal sudo[7171]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyppehuwhqpujmvfrqfgsdhdpuoriuua ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 22 09:11:32 np0005532133.novalocal sudo[7171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:11:32 np0005532133.novalocal python3[7173]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:11:32 np0005532133.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 22 09:11:32 np0005532133.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 22 09:11:32 np0005532133.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 22 09:11:32 np0005532133.novalocal systemd[1]: Stopping Network Manager...
Nov 22 09:11:32 np0005532133.novalocal NetworkManager[859]: <info>  [1763802692.8907] caught SIGTERM, shutting down normally.
Nov 22 09:11:32 np0005532133.novalocal NetworkManager[859]: <info>  [1763802692.8914] dhcp4 (eth0): canceled DHCP transaction
Nov 22 09:11:32 np0005532133.novalocal NetworkManager[859]: <info>  [1763802692.8914] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 09:11:32 np0005532133.novalocal NetworkManager[859]: <info>  [1763802692.8915] dhcp4 (eth0): state changed no lease
Nov 22 09:11:32 np0005532133.novalocal NetworkManager[859]: <info>  [1763802692.8918] manager: NetworkManager state is now CONNECTING
Nov 22 09:11:32 np0005532133.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 09:11:32 np0005532133.novalocal NetworkManager[859]: <info>  [1763802692.9056] dhcp4 (eth1): canceled DHCP transaction
Nov 22 09:11:32 np0005532133.novalocal NetworkManager[859]: <info>  [1763802692.9056] dhcp4 (eth1): state changed no lease
Nov 22 09:11:32 np0005532133.novalocal NetworkManager[859]: <info>  [1763802692.9090] exiting (success)
Nov 22 09:11:32 np0005532133.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 09:11:32 np0005532133.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 22 09:11:32 np0005532133.novalocal systemd[1]: Stopped Network Manager.
Nov 22 09:11:32 np0005532133.novalocal systemd[1]: NetworkManager.service: Consumed 1.044s CPU time, 9.9M memory peak.
Nov 22 09:11:32 np0005532133.novalocal systemd[1]: Starting Network Manager...
Nov 22 09:11:32 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802692.9642] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:3931c0a4-baf1-4f89-bc19-8b6e9c477257)
Nov 22 09:11:32 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802692.9644] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 09:11:32 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802692.9700] manager[0x561f6c69d070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 09:11:32 np0005532133.novalocal systemd[1]: Starting Hostname Service...
Nov 22 09:11:33 np0005532133.novalocal systemd[1]: Started Hostname Service.
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0393] hostname: hostname: using hostnamed
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0396] hostname: static hostname changed from (none) to "np0005532133.novalocal"
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0402] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0406] manager[0x561f6c69d070]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0407] manager[0x561f6c69d070]: rfkill: WWAN hardware radio set enabled
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0431] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0432] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0432] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0432] manager: Networking is enabled by state file
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0434] settings: Loaded settings plugin: keyfile (internal)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0439] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0465] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0475] dhcp: init: Using DHCP client 'internal'
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0478] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0484] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0489] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0496] device (lo): Activation: starting connection 'lo' (6f9484ed-c385-45a5-b64f-d6d1d6763032)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0501] device (eth0): carrier: link connected
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0505] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0509] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0509] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0516] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0524] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0531] device (eth1): carrier: link connected
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0535] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0541] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (2d6579af-6f98-3948-b6ff-43aec0679b8f) (indicated)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0542] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0548] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0555] device (eth1): Activation: starting connection 'Wired connection 1' (2d6579af-6f98-3948-b6ff-43aec0679b8f)
Nov 22 09:11:33 np0005532133.novalocal systemd[1]: Started Network Manager.
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0569] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0575] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0577] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0580] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0582] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0586] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0588] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0590] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0593] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0610] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0613] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0620] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0622] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0635] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0640] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 09:11:33 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802693.0646] device (lo): Activation: successful, device activated.
Nov 22 09:11:33 np0005532133.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 22 09:11:33 np0005532133.novalocal sudo[7171]: pam_unix(sudo:session): session closed for user root
Nov 22 09:11:33 np0005532133.novalocal python3[7238]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-3c3e-feba-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:11:34 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802694.1516] dhcp4 (eth0): state changed new lease, address=38.129.56.220
Nov 22 09:11:34 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802694.1524] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 09:11:34 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802694.1615] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 09:11:34 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802694.1654] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 09:11:34 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802694.1656] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 09:11:34 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802694.1659] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 09:11:34 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802694.1662] device (eth0): Activation: successful, device activated.
Nov 22 09:11:34 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802694.1667] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 09:11:44 np0005532133.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 09:12:03 np0005532133.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2315] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 09:12:18 np0005532133.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 09:12:18 np0005532133.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2650] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2653] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2659] device (eth1): Activation: successful, device activated.
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2663] manager: startup complete
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2665] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <warn>  [1763802738.2669] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2676] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 22 09:12:18 np0005532133.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2776] dhcp4 (eth1): canceled DHCP transaction
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2776] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2776] dhcp4 (eth1): state changed no lease
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2789] policy: auto-activating connection 'ci-private-network' (6f14d473-5f78-55da-8fa7-2b29b4ec1411)
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2792] device (eth1): Activation: starting connection 'ci-private-network' (6f14d473-5f78-55da-8fa7-2b29b4ec1411)
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2793] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2796] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2801] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2810] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2855] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2857] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:12:18 np0005532133.novalocal NetworkManager[7185]: <info>  [1763802738.2863] device (eth1): Activation: successful, device activated.
Nov 22 09:12:28 np0005532133.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 09:12:31 np0005532133.novalocal sudo[7360]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcflhsmtksnvinasuetnyesheawmvvlf ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 22 09:12:31 np0005532133.novalocal sudo[7360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:12:31 np0005532133.novalocal python3[7362]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:12:31 np0005532133.novalocal sudo[7360]: pam_unix(sudo:session): session closed for user root
Nov 22 09:12:31 np0005532133.novalocal sudo[7433]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oumyvgcxzbdzcvzgtefhlmttqxisrdwy ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 22 09:12:31 np0005532133.novalocal sudo[7433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:12:31 np0005532133.novalocal python3[7435]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763802751.028464-259-224677494856097/source _original_basename=tmpggqcwjnp follow=False checksum=800bc02b7f95c2cfc2ecefc5f8fa71c0e78ddf59 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:12:31 np0005532133.novalocal sudo[7433]: pam_unix(sudo:session): session closed for user root
Nov 22 09:12:40 np0005532133.novalocal systemd[4301]: Starting Mark boot as successful...
Nov 22 09:12:40 np0005532133.novalocal systemd[4301]: Finished Mark boot as successful.
Nov 22 09:13:31 np0005532133.novalocal sshd-session[4310]: Received disconnect from 38.102.83.114 port 60772:11: disconnected by user
Nov 22 09:13:31 np0005532133.novalocal sshd-session[4310]: Disconnected from user zuul 38.102.83.114 port 60772
Nov 22 09:13:31 np0005532133.novalocal sshd-session[4295]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:13:31 np0005532133.novalocal systemd-logind[819]: Session 1 logged out. Waiting for processes to exit.
Nov 22 09:15:40 np0005532133.novalocal systemd[4301]: Created slice User Background Tasks Slice.
Nov 22 09:15:40 np0005532133.novalocal systemd[4301]: Starting Cleanup of User's Temporary Files and Directories...
Nov 22 09:15:40 np0005532133.novalocal systemd[4301]: Finished Cleanup of User's Temporary Files and Directories.
Nov 22 09:17:49 np0005532133.novalocal sshd-session[7466]: Connection closed by 185.216.140.186 port 35536
Nov 22 09:18:28 np0005532133.novalocal sshd-session[7468]: Accepted publickey for zuul from 38.102.83.114 port 53942 ssh2: RSA SHA256:ZwJonFZsQftvUjpLgKei0MOmPha6rIt0QVmRZ5srg5s
Nov 22 09:18:28 np0005532133.novalocal systemd-logind[819]: New session 3 of user zuul.
Nov 22 09:18:28 np0005532133.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 22 09:18:28 np0005532133.novalocal sshd-session[7468]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:18:28 np0005532133.novalocal sudo[7495]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkywrhzmmcxdmeankenwudnbfbnawemy ; /usr/bin/python3'
Nov 22 09:18:28 np0005532133.novalocal sudo[7495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:28 np0005532133.novalocal python3[7497]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-4efb-a20f-000000001cc8-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:18:28 np0005532133.novalocal sudo[7495]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:28 np0005532133.novalocal sudo[7523]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orxcnfbmdozdwabhjmuzkuelydndcfvt ; /usr/bin/python3'
Nov 22 09:18:28 np0005532133.novalocal sudo[7523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:29 np0005532133.novalocal python3[7525]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:18:29 np0005532133.novalocal sudo[7523]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:29 np0005532133.novalocal sudo[7549]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjwrpfdjiebxaurqizkxfqshmacnofip ; /usr/bin/python3'
Nov 22 09:18:29 np0005532133.novalocal sudo[7549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:29 np0005532133.novalocal python3[7551]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:18:29 np0005532133.novalocal sudo[7549]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:29 np0005532133.novalocal sudo[7576]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxmsypblvszhhbthtwjuinbjebmtlvxc ; /usr/bin/python3'
Nov 22 09:18:29 np0005532133.novalocal sudo[7576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:29 np0005532133.novalocal python3[7578]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:18:29 np0005532133.novalocal sudo[7576]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:29 np0005532133.novalocal sudo[7602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsijrdqwuxyqyqydchizjtvqlnmokgic ; /usr/bin/python3'
Nov 22 09:18:29 np0005532133.novalocal sudo[7602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:29 np0005532133.novalocal python3[7604]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:18:29 np0005532133.novalocal sudo[7602]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:30 np0005532133.novalocal sudo[7628]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsjybkxkzikyokqwujvdxtpkptdicwzx ; /usr/bin/python3'
Nov 22 09:18:30 np0005532133.novalocal sudo[7628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:30 np0005532133.novalocal python3[7630]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:18:30 np0005532133.novalocal sudo[7628]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:30 np0005532133.novalocal sudo[7706]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebstguacnsklfqbfpxwvfbrtdprmhygl ; /usr/bin/python3'
Nov 22 09:18:30 np0005532133.novalocal sudo[7706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:30 np0005532133.novalocal python3[7708]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:18:30 np0005532133.novalocal sudo[7706]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:31 np0005532133.novalocal sudo[7779]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tydgbhjyhlzwzrenneixphqfegkcyzjn ; /usr/bin/python3'
Nov 22 09:18:31 np0005532133.novalocal sudo[7779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:31 np0005532133.novalocal python3[7781]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763803110.669347-473-170318352674814/source _original_basename=tmp5ns8by4m follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:18:31 np0005532133.novalocal sudo[7779]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:32 np0005532133.novalocal sudo[7829]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhkuovhaaoisurdwlmztqiagtqbscobb ; /usr/bin/python3'
Nov 22 09:18:32 np0005532133.novalocal sudo[7829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:32 np0005532133.novalocal python3[7831]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:18:32 np0005532133.novalocal systemd[1]: Reloading.
Nov 22 09:18:32 np0005532133.novalocal systemd-rc-local-generator[7853]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:18:32 np0005532133.novalocal sudo[7829]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:33 np0005532133.novalocal sudo[7885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcluveciguoojkvsmjjooifwbiatneyr ; /usr/bin/python3'
Nov 22 09:18:33 np0005532133.novalocal sudo[7885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:34 np0005532133.novalocal python3[7887]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 22 09:18:34 np0005532133.novalocal sudo[7885]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:34 np0005532133.novalocal sudo[7911]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buaqhabxzjrfwkofpkbxzqnmtjhqlsaw ; /usr/bin/python3'
Nov 22 09:18:34 np0005532133.novalocal sudo[7911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:34 np0005532133.novalocal python3[7913]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:18:34 np0005532133.novalocal sudo[7911]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:34 np0005532133.novalocal sudo[7939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypgbxnbqmeensyzdmepskvqdipjtvxvg ; /usr/bin/python3'
Nov 22 09:18:34 np0005532133.novalocal sudo[7939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:34 np0005532133.novalocal python3[7941]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:18:34 np0005532133.novalocal sudo[7939]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:34 np0005532133.novalocal sudo[7967]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofbybjcdptzzqrbhuclnheehrsyqepoh ; /usr/bin/python3'
Nov 22 09:18:34 np0005532133.novalocal sudo[7967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:34 np0005532133.novalocal python3[7969]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:18:34 np0005532133.novalocal sudo[7967]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:35 np0005532133.novalocal sudo[7995]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykexoaknxtcdodcdlvleduisbxlhzwst ; /usr/bin/python3'
Nov 22 09:18:35 np0005532133.novalocal sudo[7995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:35 np0005532133.novalocal python3[7997]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:18:35 np0005532133.novalocal sudo[7995]: pam_unix(sudo:session): session closed for user root
Nov 22 09:18:35 np0005532133.novalocal python3[8024]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-4efb-a20f-000000001ccf-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:18:36 np0005532133.novalocal python3[8054]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 22 09:18:38 np0005532133.novalocal sshd-session[7471]: Connection closed by 38.102.83.114 port 53942
Nov 22 09:18:38 np0005532133.novalocal sshd-session[7468]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:18:38 np0005532133.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 22 09:18:38 np0005532133.novalocal systemd[1]: session-3.scope: Consumed 4.073s CPU time.
Nov 22 09:18:38 np0005532133.novalocal systemd-logind[819]: Session 3 logged out. Waiting for processes to exit.
Nov 22 09:18:38 np0005532133.novalocal systemd-logind[819]: Removed session 3.
Nov 22 09:18:39 np0005532133.novalocal sshd-session[8058]: Accepted publickey for zuul from 38.102.83.114 port 53160 ssh2: RSA SHA256:ZwJonFZsQftvUjpLgKei0MOmPha6rIt0QVmRZ5srg5s
Nov 22 09:18:39 np0005532133.novalocal systemd-logind[819]: New session 4 of user zuul.
Nov 22 09:18:39 np0005532133.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 22 09:18:39 np0005532133.novalocal sshd-session[8058]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:18:39 np0005532133.novalocal sudo[8085]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbtlgqvbnzmjtlbyjcflzhrdtgnagoeh ; /usr/bin/python3'
Nov 22 09:18:39 np0005532133.novalocal sudo[8085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:18:40 np0005532133.novalocal python3[8087]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 22 09:18:55 np0005532133.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 22 09:18:55 np0005532133.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 09:18:55 np0005532133.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 22 09:18:55 np0005532133.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 09:18:55 np0005532133.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 22 09:18:55 np0005532133.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 09:18:55 np0005532133.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 09:18:55 np0005532133.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 09:19:05 np0005532133.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 22 09:19:05 np0005532133.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 09:19:05 np0005532133.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 22 09:19:05 np0005532133.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 09:19:05 np0005532133.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 22 09:19:05 np0005532133.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 09:19:05 np0005532133.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 09:19:05 np0005532133.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 09:19:16 np0005532133.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 22 09:19:16 np0005532133.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 09:19:16 np0005532133.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 22 09:19:16 np0005532133.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 09:19:16 np0005532133.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 22 09:19:16 np0005532133.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 09:19:16 np0005532133.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 09:19:16 np0005532133.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 09:19:17 np0005532133.novalocal setsebool[8154]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 22 09:19:17 np0005532133.novalocal setsebool[8154]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 22 09:19:30 np0005532133.novalocal kernel: SELinux:  Converting 388 SID table entries...
Nov 22 09:19:30 np0005532133.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 09:19:30 np0005532133.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 22 09:19:30 np0005532133.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 09:19:30 np0005532133.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 22 09:19:30 np0005532133.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 09:19:30 np0005532133.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 09:19:30 np0005532133.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 09:19:49 np0005532133.novalocal dbus-broker-launch[813]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 22 09:19:49 np0005532133.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 09:19:49 np0005532133.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 22 09:19:49 np0005532133.novalocal systemd[1]: Reloading.
Nov 22 09:19:49 np0005532133.novalocal systemd-rc-local-generator[8908]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:19:49 np0005532133.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 09:19:51 np0005532133.novalocal sudo[8085]: pam_unix(sudo:session): session closed for user root
Nov 22 09:19:55 np0005532133.novalocal python3[12836]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163efc-24cc-351a-59a6-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:19:55 np0005532133.novalocal kernel: evm: overlay not supported
Nov 22 09:19:56 np0005532133.novalocal systemd[4301]: Starting D-Bus User Message Bus...
Nov 22 09:19:56 np0005532133.novalocal dbus-broker-launch[13653]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 22 09:19:56 np0005532133.novalocal dbus-broker-launch[13653]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 22 09:19:56 np0005532133.novalocal systemd[4301]: Started D-Bus User Message Bus.
Nov 22 09:19:56 np0005532133.novalocal dbus-broker-lau[13653]: Ready
Nov 22 09:19:56 np0005532133.novalocal systemd[4301]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 22 09:19:56 np0005532133.novalocal systemd[4301]: Created slice Slice /user.
Nov 22 09:19:56 np0005532133.novalocal systemd[4301]: podman-13534.scope: unit configures an IP firewall, but not running as root.
Nov 22 09:19:56 np0005532133.novalocal systemd[4301]: (This warning is only shown for the first unit using IP firewalling.)
Nov 22 09:19:56 np0005532133.novalocal systemd[4301]: Started podman-13534.scope.
Nov 22 09:19:56 np0005532133.novalocal systemd[4301]: Started podman-pause-2d847339.scope.
Nov 22 09:19:57 np0005532133.novalocal sudo[14110]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdjufunbpotnunjbshxbrfyubcvwfpwm ; /usr/bin/python3'
Nov 22 09:19:57 np0005532133.novalocal sudo[14110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:19:57 np0005532133.novalocal python3[14124]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.32:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.32:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:19:57 np0005532133.novalocal python3[14124]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 22 09:19:57 np0005532133.novalocal sudo[14110]: pam_unix(sudo:session): session closed for user root
Nov 22 09:19:58 np0005532133.novalocal sshd-session[8061]: Connection closed by 38.102.83.114 port 53160
Nov 22 09:19:58 np0005532133.novalocal sshd-session[8058]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:19:58 np0005532133.novalocal systemd-logind[819]: Session 4 logged out. Waiting for processes to exit.
Nov 22 09:19:58 np0005532133.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 22 09:19:58 np0005532133.novalocal systemd[1]: session-4.scope: Consumed 1min 6.299s CPU time.
Nov 22 09:19:58 np0005532133.novalocal systemd-logind[819]: Removed session 4.
Nov 22 09:20:18 np0005532133.novalocal sshd-session[21736]: Unable to negotiate with 38.102.83.144 port 34332: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 22 09:20:18 np0005532133.novalocal sshd-session[21734]: Connection closed by 38.102.83.144 port 34292 [preauth]
Nov 22 09:20:18 np0005532133.novalocal sshd-session[21738]: Connection closed by 38.102.83.144 port 34308 [preauth]
Nov 22 09:20:18 np0005532133.novalocal sshd-session[21742]: Unable to negotiate with 38.102.83.144 port 34310: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 22 09:20:18 np0005532133.novalocal sshd-session[21741]: Unable to negotiate with 38.102.83.144 port 34324: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 22 09:20:22 np0005532133.novalocal sshd-session[23439]: Accepted publickey for zuul from 38.102.83.114 port 41424 ssh2: RSA SHA256:ZwJonFZsQftvUjpLgKei0MOmPha6rIt0QVmRZ5srg5s
Nov 22 09:20:22 np0005532133.novalocal systemd-logind[819]: New session 5 of user zuul.
Nov 22 09:20:22 np0005532133.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 22 09:20:22 np0005532133.novalocal sshd-session[23439]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:20:22 np0005532133.novalocal python3[23531]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN5R4ZsbJDUo8gHi5nJtEj6CeSHo5+knZKEaO5OFjiyFKwGLa+MpoXJc/bDPCNs/0APEJHmiJmCBVgu4SYwJMRI= zuul@np0005532132.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:20:23 np0005532133.novalocal sudo[23686]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrtxetmfxsiyozhowofshmgyjypyvhtw ; /usr/bin/python3'
Nov 22 09:20:23 np0005532133.novalocal sudo[23686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:20:23 np0005532133.novalocal python3[23696]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN5R4ZsbJDUo8gHi5nJtEj6CeSHo5+knZKEaO5OFjiyFKwGLa+MpoXJc/bDPCNs/0APEJHmiJmCBVgu4SYwJMRI= zuul@np0005532132.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:20:23 np0005532133.novalocal sudo[23686]: pam_unix(sudo:session): session closed for user root
Nov 22 09:20:24 np0005532133.novalocal sudo[23925]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bodyifnviazzhrlqvvgsozumtfohxxqz ; /usr/bin/python3'
Nov 22 09:20:24 np0005532133.novalocal sudo[23925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:20:24 np0005532133.novalocal python3[23934]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532133.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 22 09:20:24 np0005532133.novalocal useradd[24024]: new group: name=cloud-admin, GID=1002
Nov 22 09:20:24 np0005532133.novalocal useradd[24024]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Nov 22 09:20:24 np0005532133.novalocal sudo[23925]: pam_unix(sudo:session): session closed for user root
Nov 22 09:20:24 np0005532133.novalocal sudo[24143]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqdtbqcgmzwlobahifgcltnsafrximul ; /usr/bin/python3'
Nov 22 09:20:24 np0005532133.novalocal sudo[24143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:20:24 np0005532133.novalocal python3[24150]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN5R4ZsbJDUo8gHi5nJtEj6CeSHo5+knZKEaO5OFjiyFKwGLa+MpoXJc/bDPCNs/0APEJHmiJmCBVgu4SYwJMRI= zuul@np0005532132.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 09:20:24 np0005532133.novalocal sudo[24143]: pam_unix(sudo:session): session closed for user root
Nov 22 09:20:24 np0005532133.novalocal sudo[24379]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzswfwqztjytgjbqqgoeehhjysnebwvv ; /usr/bin/python3'
Nov 22 09:20:24 np0005532133.novalocal sudo[24379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:20:25 np0005532133.novalocal python3[24386]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:20:25 np0005532133.novalocal sudo[24379]: pam_unix(sudo:session): session closed for user root
Nov 22 09:20:25 np0005532133.novalocal sudo[24640]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zufkjpvtzayodamqexxfwrjormdrvzel ; /usr/bin/python3'
Nov 22 09:20:25 np0005532133.novalocal sudo[24640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:20:25 np0005532133.novalocal python3[24646]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763803224.8672457-135-121767468418138/source _original_basename=tmpk0z4i6y2 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:20:25 np0005532133.novalocal sudo[24640]: pam_unix(sudo:session): session closed for user root
Nov 22 09:20:26 np0005532133.novalocal sudo[24939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlpfhglrtgwemqotwqiuggknrwyabcsw ; /usr/bin/python3'
Nov 22 09:20:26 np0005532133.novalocal sudo[24939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:20:26 np0005532133.novalocal python3[24946]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Nov 22 09:20:26 np0005532133.novalocal systemd[1]: Starting Hostname Service...
Nov 22 09:20:26 np0005532133.novalocal systemd[1]: Started Hostname Service.
Nov 22 09:20:26 np0005532133.novalocal systemd-hostnamed[25062]: Changed pretty hostname to 'compute-0'
Nov 22 09:20:26 compute-0 systemd-hostnamed[25062]: Hostname set to <compute-0> (static)
Nov 22 09:20:26 compute-0 NetworkManager[7185]: <info>  [1763803226.6238] hostname: static hostname changed from "np0005532133.novalocal" to "compute-0"
Nov 22 09:20:26 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 09:20:26 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 09:20:26 compute-0 sudo[24939]: pam_unix(sudo:session): session closed for user root
Nov 22 09:20:27 compute-0 sshd-session[23478]: Connection closed by 38.102.83.114 port 41424
Nov 22 09:20:27 compute-0 sshd-session[23439]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:20:27 compute-0 systemd[1]: session-5.scope: Deactivated successfully.
Nov 22 09:20:27 compute-0 systemd[1]: session-5.scope: Consumed 2.370s CPU time.
Nov 22 09:20:27 compute-0 systemd-logind[819]: Session 5 logged out. Waiting for processes to exit.
Nov 22 09:20:27 compute-0 systemd-logind[819]: Removed session 5.
Nov 22 09:20:36 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 09:20:39 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 09:20:39 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 09:20:39 compute-0 systemd[1]: man-db-cache-update.service: Consumed 58.270s CPU time.
Nov 22 09:20:39 compute-0 systemd[1]: run-rfbea678d59224027bb2c9804517a5c59.service: Deactivated successfully.
Nov 22 09:20:56 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 09:24:40 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 22 09:24:40 compute-0 systemd[1]: Starting dnf makecache...
Nov 22 09:24:40 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 22 09:24:40 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 22 09:24:40 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 22 09:24:40 compute-0 dnf[29913]: Failed determining last makecache time.
Nov 22 09:24:41 compute-0 dnf[29913]: CentOS Stream 9 - BaseOS                         25 kB/s | 7.3 kB     00:00
Nov 22 09:24:41 compute-0 dnf[29913]: CentOS Stream 9 - AppStream                      30 kB/s | 7.4 kB     00:00
Nov 22 09:24:41 compute-0 dnf[29913]: CentOS Stream 9 - CRB                            46 kB/s | 7.2 kB     00:00
Nov 22 09:24:42 compute-0 dnf[29913]: CentOS Stream 9 - Extras packages                26 kB/s | 8.3 kB     00:00
Nov 22 09:24:42 compute-0 dnf[29913]: Metadata cache created.
Nov 22 09:24:42 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 22 09:24:42 compute-0 systemd[1]: Finished dnf makecache.
Nov 22 09:25:14 compute-0 sshd-session[29919]: Accepted publickey for zuul from 38.102.83.144 port 39556 ssh2: RSA SHA256:ZwJonFZsQftvUjpLgKei0MOmPha6rIt0QVmRZ5srg5s
Nov 22 09:25:14 compute-0 systemd-logind[819]: New session 6 of user zuul.
Nov 22 09:25:14 compute-0 systemd[1]: Started Session 6 of User zuul.
Nov 22 09:25:14 compute-0 sshd-session[29919]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:25:15 compute-0 python3[29995]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:25:16 compute-0 sudo[30109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hloijrxsmkdczzqcckhekcjizjipmzdk ; /usr/bin/python3'
Nov 22 09:25:16 compute-0 sudo[30109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:16 compute-0 python3[30111]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:25:16 compute-0 sudo[30109]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:17 compute-0 sudo[30183]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlafcyivgklfxopcejhqoshhedtzxpjr ; /usr/bin/python3'
Nov 22 09:25:17 compute-0 sudo[30183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:17 compute-0 python3[30185]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763803516.545449-33537-160156619845421/source mode=0755 _original_basename=delorean.repo follow=False checksum=1830be8248976a7f714fb01ca8550e92dfc79ad2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:25:17 compute-0 sudo[30183]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:17 compute-0 sudo[30209]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkonivqhlrwenkccarfjlozetflquviv ; /usr/bin/python3'
Nov 22 09:25:17 compute-0 sudo[30209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:17 compute-0 python3[30211]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:25:17 compute-0 sudo[30209]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:17 compute-0 sudo[30282]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blxyweaupzzsqzakdxxhsesbebwcseth ; /usr/bin/python3'
Nov 22 09:25:17 compute-0 sudo[30282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:17 compute-0 python3[30284]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763803516.545449-33537-160156619845421/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:25:17 compute-0 sudo[30282]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:18 compute-0 sudo[30308]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdckppfgsbyosstugirheaopuxatkfxh ; /usr/bin/python3'
Nov 22 09:25:18 compute-0 sudo[30308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:18 compute-0 python3[30310]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:25:18 compute-0 sudo[30308]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:18 compute-0 sudo[30381]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vizrttnssenipvcxzyhxfvpupwhhvbqd ; /usr/bin/python3'
Nov 22 09:25:18 compute-0 sudo[30381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:18 compute-0 python3[30383]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763803516.545449-33537-160156619845421/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:25:18 compute-0 sudo[30381]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:18 compute-0 sudo[30407]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noqyujpiuwzibmqsjhjvcrhlylzhtzwu ; /usr/bin/python3'
Nov 22 09:25:18 compute-0 sudo[30407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:18 compute-0 python3[30409]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:25:18 compute-0 sudo[30407]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:19 compute-0 sudo[30480]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfzwglqhcnpuidozhsmyasjsqnsakixs ; /usr/bin/python3'
Nov 22 09:25:19 compute-0 sudo[30480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:19 compute-0 python3[30482]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763803516.545449-33537-160156619845421/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:25:19 compute-0 sudo[30480]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:19 compute-0 sudo[30506]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auhlgsqzixfgbgplgqwxrpphmlgpjlzs ; /usr/bin/python3'
Nov 22 09:25:19 compute-0 sudo[30506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:19 compute-0 python3[30508]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:25:19 compute-0 sudo[30506]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:19 compute-0 sudo[30579]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcxshqcnbklrelovlxkfxbmrcgrttgtd ; /usr/bin/python3'
Nov 22 09:25:19 compute-0 sudo[30579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:19 compute-0 python3[30581]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763803516.545449-33537-160156619845421/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:25:19 compute-0 sudo[30579]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:19 compute-0 sudo[30605]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbdnclwqmricvfonhsmxvrrmqrmeitpq ; /usr/bin/python3'
Nov 22 09:25:19 compute-0 sudo[30605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:20 compute-0 python3[30607]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:25:20 compute-0 sudo[30605]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:20 compute-0 sudo[30678]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llhkpfkcmjioqsgngfpqgnfrurmpgsfq ; /usr/bin/python3'
Nov 22 09:25:20 compute-0 sudo[30678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:20 compute-0 python3[30680]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763803516.545449-33537-160156619845421/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:25:20 compute-0 sudo[30678]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:20 compute-0 sudo[30704]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sibsymvyygfostsvmtnaewbxavgfauie ; /usr/bin/python3'
Nov 22 09:25:20 compute-0 sudo[30704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:20 compute-0 python3[30706]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 09:25:20 compute-0 sudo[30704]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:20 compute-0 sudo[30777]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amjhjnlxvgidyizexcwrosyhzbjbxhxk ; /usr/bin/python3'
Nov 22 09:25:20 compute-0 sudo[30777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:25:21 compute-0 python3[30779]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763803516.545449-33537-160156619845421/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6646317362318a9831d66a1804f6bb7dd1b97cd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:25:21 compute-0 sudo[30777]: pam_unix(sudo:session): session closed for user root
Nov 22 09:25:23 compute-0 sshd-session[30804]: Connection closed by 192.168.122.11 port 51942 [preauth]
Nov 22 09:25:23 compute-0 sshd-session[30806]: Connection closed by 192.168.122.11 port 51940 [preauth]
Nov 22 09:25:23 compute-0 sshd-session[30807]: Unable to negotiate with 192.168.122.11 port 51946: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 22 09:25:23 compute-0 sshd-session[30805]: Unable to negotiate with 192.168.122.11 port 51960: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 22 09:25:23 compute-0 sshd-session[30808]: Unable to negotiate with 192.168.122.11 port 51952: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 22 09:25:32 compute-0 python3[30837]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:30:32 compute-0 sshd-session[29922]: Received disconnect from 38.102.83.144 port 39556:11: disconnected by user
Nov 22 09:30:32 compute-0 sshd-session[29922]: Disconnected from user zuul 38.102.83.144 port 39556
Nov 22 09:30:32 compute-0 sshd-session[29919]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:30:32 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Nov 22 09:30:32 compute-0 systemd[1]: session-6.scope: Consumed 4.700s CPU time.
Nov 22 09:30:32 compute-0 systemd-logind[819]: Session 6 logged out. Waiting for processes to exit.
Nov 22 09:30:32 compute-0 systemd-logind[819]: Removed session 6.
Nov 22 09:36:47 compute-0 sshd-session[30844]: Accepted publickey for zuul from 192.168.122.30 port 35892 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:36:47 compute-0 systemd-logind[819]: New session 7 of user zuul.
Nov 22 09:36:47 compute-0 systemd[1]: Started Session 7 of User zuul.
Nov 22 09:36:47 compute-0 sshd-session[30844]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:36:48 compute-0 python3.9[30997]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:36:50 compute-0 sudo[31176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiomzppjltzonutprdvvpbxgzmyugwoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804209.526689-32-267331878119228/AnsiballZ_command.py'
Nov 22 09:36:50 compute-0 sudo[31176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:36:50 compute-0 python3.9[31178]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:36:57 compute-0 sudo[31176]: pam_unix(sudo:session): session closed for user root
Nov 22 09:36:58 compute-0 sshd-session[30847]: Connection closed by 192.168.122.30 port 35892
Nov 22 09:36:58 compute-0 sshd-session[30844]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:36:58 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Nov 22 09:36:58 compute-0 systemd[1]: session-7.scope: Consumed 8.123s CPU time.
Nov 22 09:36:58 compute-0 systemd-logind[819]: Session 7 logged out. Waiting for processes to exit.
Nov 22 09:36:58 compute-0 systemd-logind[819]: Removed session 7.
Nov 22 09:37:04 compute-0 sshd-session[31235]: Accepted publickey for zuul from 192.168.122.30 port 47104 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:37:04 compute-0 systemd-logind[819]: New session 8 of user zuul.
Nov 22 09:37:04 compute-0 systemd[1]: Started Session 8 of User zuul.
Nov 22 09:37:04 compute-0 sshd-session[31235]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:37:05 compute-0 python3.9[31388]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:37:06 compute-0 sshd-session[31238]: Connection closed by 192.168.122.30 port 47104
Nov 22 09:37:06 compute-0 sshd-session[31235]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:37:06 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Nov 22 09:37:06 compute-0 systemd-logind[819]: Session 8 logged out. Waiting for processes to exit.
Nov 22 09:37:06 compute-0 systemd-logind[819]: Removed session 8.
Nov 22 09:37:22 compute-0 sshd-session[31418]: Accepted publickey for zuul from 192.168.122.30 port 53570 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:37:22 compute-0 systemd-logind[819]: New session 9 of user zuul.
Nov 22 09:37:22 compute-0 systemd[1]: Started Session 9 of User zuul.
Nov 22 09:37:22 compute-0 sshd-session[31418]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:37:23 compute-0 python3.9[31571]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 22 09:37:24 compute-0 python3.9[31745]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:37:25 compute-0 sudo[31895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-livohehalyivinkbjgmaugxtzoictnjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804244.7532985-45-267631110440817/AnsiballZ_command.py'
Nov 22 09:37:25 compute-0 sudo[31895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:37:25 compute-0 python3.9[31897]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:37:25 compute-0 sudo[31895]: pam_unix(sudo:session): session closed for user root
Nov 22 09:37:26 compute-0 sudo[32048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljblitbuuwjssksljldkbhjwooqfmema ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804245.8617163-57-170391548728498/AnsiballZ_stat.py'
Nov 22 09:37:26 compute-0 sudo[32048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:37:26 compute-0 python3.9[32050]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:37:26 compute-0 sudo[32048]: pam_unix(sudo:session): session closed for user root
Nov 22 09:37:27 compute-0 sudo[32200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnhyfvviredswhapjxgtbzapvpxwnuab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804246.7887604-65-42076408605418/AnsiballZ_file.py'
Nov 22 09:37:27 compute-0 sudo[32200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:37:27 compute-0 python3.9[32202]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:37:27 compute-0 sudo[32200]: pam_unix(sudo:session): session closed for user root
Nov 22 09:37:27 compute-0 sudo[32352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlzrzrolsrllubgmuptuefitfeonkrbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804247.6815794-73-25187088342035/AnsiballZ_stat.py'
Nov 22 09:37:28 compute-0 sudo[32352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:37:28 compute-0 python3.9[32354]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:37:28 compute-0 sudo[32352]: pam_unix(sudo:session): session closed for user root
Nov 22 09:37:28 compute-0 sudo[32475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hobtxujgbjcqohdgbgdftbnlfodabinp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804247.6815794-73-25187088342035/AnsiballZ_copy.py'
Nov 22 09:37:28 compute-0 sudo[32475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:37:28 compute-0 python3.9[32477]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804247.6815794-73-25187088342035/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:37:28 compute-0 sudo[32475]: pam_unix(sudo:session): session closed for user root
Nov 22 09:37:29 compute-0 sudo[32627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucxqhlsyfisknlgthqmqcjdvzkcshfst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804249.2044418-88-83046269530046/AnsiballZ_setup.py'
Nov 22 09:37:29 compute-0 sudo[32627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:37:29 compute-0 python3.9[32629]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:37:30 compute-0 sudo[32627]: pam_unix(sudo:session): session closed for user root
Nov 22 09:37:30 compute-0 sudo[32783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptaiycywbrfiozbxlkkqqvtrwqisddgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804250.263995-96-140990674975055/AnsiballZ_file.py'
Nov 22 09:37:30 compute-0 sudo[32783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:37:30 compute-0 python3.9[32785]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:37:30 compute-0 sudo[32783]: pam_unix(sudo:session): session closed for user root
Nov 22 09:37:31 compute-0 sudo[32936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyqyzvbxsfktepjzjrowveshfgsgwsjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804251.1564147-105-40356540321952/AnsiballZ_file.py'
Nov 22 09:37:31 compute-0 sudo[32936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:37:31 compute-0 python3.9[32938]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:37:31 compute-0 sudo[32936]: pam_unix(sudo:session): session closed for user root
Nov 22 09:37:32 compute-0 python3.9[33088]: ansible-ansible.builtin.service_facts Invoked
Nov 22 09:37:36 compute-0 python3.9[33341]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:37:37 compute-0 python3.9[33491]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:37:39 compute-0 python3.9[33645]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:37:40 compute-0 sudo[33801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfosutnvzflyffjyzhpuumbvhssnyuji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804259.7448175-153-63513458006612/AnsiballZ_setup.py'
Nov 22 09:37:40 compute-0 sudo[33801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:37:40 compute-0 python3.9[33803]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:37:40 compute-0 sudo[33801]: pam_unix(sudo:session): session closed for user root
Nov 22 09:37:41 compute-0 sudo[33885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikmrlwnjpwqupwfuwoqerqnowrqyyami ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804259.7448175-153-63513458006612/AnsiballZ_dnf.py'
Nov 22 09:37:41 compute-0 sudo[33885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:37:41 compute-0 python3.9[33887]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:38:24 compute-0 systemd[1]: Reloading.
Nov 22 09:38:24 compute-0 systemd-rc-local-generator[34082]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:38:24 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 22 09:38:24 compute-0 systemd[1]: Reloading.
Nov 22 09:38:25 compute-0 systemd-rc-local-generator[34119]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:38:25 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 22 09:38:25 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 22 09:38:25 compute-0 systemd[1]: Reloading.
Nov 22 09:38:25 compute-0 systemd-rc-local-generator[34166]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:38:25 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 22 09:38:26 compute-0 dbus-broker-launch[812]: Noticed file-system modification, trigger reload.
Nov 22 09:38:26 compute-0 dbus-broker-launch[812]: Noticed file-system modification, trigger reload.
Nov 22 09:38:26 compute-0 dbus-broker-launch[812]: Noticed file-system modification, trigger reload.
Nov 22 09:39:26 compute-0 kernel: SELinux:  Converting 2718 SID table entries...
Nov 22 09:39:26 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 09:39:26 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 09:39:26 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 09:39:26 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 09:39:26 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 09:39:26 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 09:39:26 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 09:39:27 compute-0 dbus-broker-launch[813]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 22 09:39:27 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 09:39:27 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 09:39:27 compute-0 systemd[1]: Reloading.
Nov 22 09:39:27 compute-0 systemd-rc-local-generator[34497]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:39:27 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 09:39:27 compute-0 sudo[33885]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:28 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 09:39:28 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 09:39:28 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.421s CPU time.
Nov 22 09:39:28 compute-0 systemd[1]: run-rb20de48f364c4732a51a0304a72320aa.service: Deactivated successfully.
Nov 22 09:39:28 compute-0 sudo[35406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdwlnpmxckpymjssrbollsorezqscply ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804368.2344809-165-59527764727366/AnsiballZ_command.py'
Nov 22 09:39:28 compute-0 sudo[35406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:28 compute-0 python3.9[35408]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:39:29 compute-0 sudo[35406]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:30 compute-0 sudo[35688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngjlingxqizprwfbsqrviswvrfsurhha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804370.010207-173-29570871220727/AnsiballZ_selinux.py'
Nov 22 09:39:30 compute-0 sudo[35688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:30 compute-0 python3.9[35690]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 22 09:39:31 compute-0 sudo[35688]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:31 compute-0 sudo[35840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpqfhrvjfbvcbhdputbymipygvyyfpqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804371.378945-184-27010534837770/AnsiballZ_command.py'
Nov 22 09:39:31 compute-0 sudo[35840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:31 compute-0 python3.9[35842]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 22 09:39:32 compute-0 sudo[35840]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:33 compute-0 sudo[35993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snsyjuefrdifrqijdaftxeqimimufcio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804373.0470188-192-137345556776938/AnsiballZ_file.py'
Nov 22 09:39:33 compute-0 sudo[35993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:33 compute-0 python3.9[35995]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:39:33 compute-0 sudo[35993]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:34 compute-0 sudo[36145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqlpaevahffscplihiwrgopgoazaflyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804374.1988032-200-47792384614667/AnsiballZ_mount.py'
Nov 22 09:39:34 compute-0 sudo[36145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:35 compute-0 python3.9[36147]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 22 09:39:35 compute-0 sudo[36145]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:36 compute-0 sudo[36297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psivmgqnrbtlzeebimiyfywjaejrpvnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804376.0064616-228-256698864771628/AnsiballZ_file.py'
Nov 22 09:39:36 compute-0 sudo[36297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:36 compute-0 python3.9[36299]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:39:36 compute-0 sudo[36297]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:37 compute-0 sudo[36449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeljivmnvdkwwbsmizgwpywfbwxheuho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804376.7769554-236-139484000048519/AnsiballZ_stat.py'
Nov 22 09:39:37 compute-0 sudo[36449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:37 compute-0 python3.9[36451]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:39:37 compute-0 sudo[36449]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:37 compute-0 sudo[36572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utffvauvqzvjcbmvoawyrpebmrofpeic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804376.7769554-236-139484000048519/AnsiballZ_copy.py'
Nov 22 09:39:37 compute-0 sudo[36572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:39 compute-0 python3.9[36574]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804376.7769554-236-139484000048519/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=055ba4bab0d3961120a057b550a73cdd0a7df715 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:39:40 compute-0 sudo[36572]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:40 compute-0 sudo[36724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgcaeohstspjetnvnaxsvkmtfrvxovof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804380.6011755-260-36762132591624/AnsiballZ_stat.py'
Nov 22 09:39:40 compute-0 sudo[36724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:41 compute-0 python3.9[36726]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:39:41 compute-0 sudo[36724]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:41 compute-0 sudo[36876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twmwbdskxdyblbrwxotzqruubcmyxizv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804381.3704062-268-187955296200231/AnsiballZ_command.py'
Nov 22 09:39:41 compute-0 sudo[36876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:41 compute-0 python3.9[36878]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:39:42 compute-0 sudo[36876]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:42 compute-0 sudo[37029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-farmuskyqusqajrnyvpgiftliawwkwbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804382.2885625-276-185532509047677/AnsiballZ_file.py'
Nov 22 09:39:42 compute-0 sudo[37029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:42 compute-0 python3.9[37031]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:39:42 compute-0 sudo[37029]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:43 compute-0 sudo[37181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yodmtuswghtmpquxtoswihgehaseyhic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804383.416173-287-205531487535330/AnsiballZ_getent.py'
Nov 22 09:39:43 compute-0 sudo[37181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:44 compute-0 python3.9[37183]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 22 09:39:44 compute-0 sudo[37181]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:44 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 09:39:44 compute-0 sudo[37335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quajpnkodqaonveouydyskorftsjwrfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804384.4066188-295-50240660216238/AnsiballZ_group.py'
Nov 22 09:39:44 compute-0 sudo[37335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:45 compute-0 python3.9[37337]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 09:39:45 compute-0 groupadd[37338]: group added to /etc/group: name=qemu, GID=107
Nov 22 09:39:45 compute-0 groupadd[37338]: group added to /etc/gshadow: name=qemu
Nov 22 09:39:45 compute-0 groupadd[37338]: new group: name=qemu, GID=107
Nov 22 09:39:45 compute-0 sudo[37335]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:45 compute-0 sudo[37493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buaatvdpcxvukavdumvethdnhsymjjtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804385.3431163-303-141472230271681/AnsiballZ_user.py'
Nov 22 09:39:45 compute-0 sudo[37493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:46 compute-0 python3.9[37495]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 09:39:46 compute-0 useradd[37497]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Nov 22 09:39:46 compute-0 sudo[37493]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:46 compute-0 sudo[37653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkppgubdefljbnfnkthrkkrolhuuzvsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804386.5508695-311-270855929676647/AnsiballZ_getent.py'
Nov 22 09:39:46 compute-0 sudo[37653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:47 compute-0 python3.9[37655]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 22 09:39:47 compute-0 sudo[37653]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:47 compute-0 sudo[37806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iixnxsywyzrlgxktjdvuolegombzjyme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804387.366548-319-260732021982891/AnsiballZ_group.py'
Nov 22 09:39:47 compute-0 sudo[37806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:47 compute-0 python3.9[37808]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 09:39:48 compute-0 groupadd[37809]: group added to /etc/group: name=hugetlbfs, GID=42477
Nov 22 09:39:48 compute-0 groupadd[37809]: group added to /etc/gshadow: name=hugetlbfs
Nov 22 09:39:48 compute-0 groupadd[37809]: new group: name=hugetlbfs, GID=42477
Nov 22 09:39:48 compute-0 sudo[37806]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:48 compute-0 sudo[37964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lidchjylznuultodesgfvhvcckhlrwya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804388.3188376-328-255266188733850/AnsiballZ_file.py'
Nov 22 09:39:48 compute-0 sudo[37964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:48 compute-0 python3.9[37966]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 22 09:39:48 compute-0 sudo[37964]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:49 compute-0 sudo[38116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-proazvpwqcchbrfhmgqkruoprgunotjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804389.285291-339-280306255870683/AnsiballZ_dnf.py'
Nov 22 09:39:49 compute-0 sudo[38116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:49 compute-0 python3.9[38118]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:39:51 compute-0 sudo[38116]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:51 compute-0 sudo[38269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eavbxouuiiveiscxgeebydcbmamqfkcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804391.6260192-347-145426965780009/AnsiballZ_file.py'
Nov 22 09:39:51 compute-0 sudo[38269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:52 compute-0 python3.9[38271]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:39:52 compute-0 sudo[38269]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:52 compute-0 sudo[38421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpdavogxqbnopnlmwicgjthctpeivynk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804392.4707112-355-269958089811863/AnsiballZ_stat.py'
Nov 22 09:39:52 compute-0 sudo[38421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:53 compute-0 python3.9[38423]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:39:53 compute-0 sudo[38421]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:53 compute-0 sudo[38544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sprtxwqwosgsatcwxoclcropueleiirf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804392.4707112-355-269958089811863/AnsiballZ_copy.py'
Nov 22 09:39:53 compute-0 sudo[38544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:53 compute-0 python3.9[38546]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804392.4707112-355-269958089811863/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:39:53 compute-0 sudo[38544]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:54 compute-0 sudo[38696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcedxpqeojsxblqbnxppysbruupcnaru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804393.9266727-370-75362464822229/AnsiballZ_systemd.py'
Nov 22 09:39:54 compute-0 sudo[38696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:54 compute-0 python3.9[38698]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:39:55 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 22 09:39:55 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 22 09:39:55 compute-0 kernel: Bridge firewalling registered
Nov 22 09:39:55 compute-0 systemd-modules-load[38702]: Inserted module 'br_netfilter'
Nov 22 09:39:55 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 22 09:39:55 compute-0 sudo[38696]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:55 compute-0 sudo[38855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ostvtqlclkobzmfxkmjvpdgvjiasypcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804395.3653088-378-111318871795328/AnsiballZ_stat.py'
Nov 22 09:39:55 compute-0 sudo[38855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:55 compute-0 python3.9[38857]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:39:55 compute-0 sudo[38855]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:56 compute-0 sudo[38978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdkkxjxihtigmjcchftyqjqqqfodeoqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804395.3653088-378-111318871795328/AnsiballZ_copy.py'
Nov 22 09:39:56 compute-0 sudo[38978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:56 compute-0 python3.9[38980]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804395.3653088-378-111318871795328/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:39:56 compute-0 sudo[38978]: pam_unix(sudo:session): session closed for user root
Nov 22 09:39:57 compute-0 sudo[39130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eietjdplbdvuxthqnanmbiktxaenkqci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804396.9782262-396-44834502198346/AnsiballZ_dnf.py'
Nov 22 09:39:57 compute-0 sudo[39130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:39:57 compute-0 python3.9[39132]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:40:00 compute-0 dbus-broker-launch[812]: Noticed file-system modification, trigger reload.
Nov 22 09:40:00 compute-0 dbus-broker-launch[812]: Noticed file-system modification, trigger reload.
Nov 22 09:40:01 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 09:40:01 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 09:40:01 compute-0 systemd[1]: Reloading.
Nov 22 09:40:01 compute-0 systemd-rc-local-generator[39187]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:40:01 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 09:40:01 compute-0 sudo[39130]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:02 compute-0 python3.9[40456]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:40:03 compute-0 python3.9[41489]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 22 09:40:03 compute-0 python3.9[42234]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:40:04 compute-0 sudo[43181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpznywynlubiskzzqjwtoflyuiymqgwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804404.1177263-435-61036923132363/AnsiballZ_command.py'
Nov 22 09:40:04 compute-0 sudo[43181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:04 compute-0 python3.9[43203]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:40:04 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 09:40:04 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 09:40:04 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.500s CPU time.
Nov 22 09:40:04 compute-0 systemd[1]: run-rcc75a43f1491407bb2848127698d8675.service: Deactivated successfully.
Nov 22 09:40:04 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 22 09:40:05 compute-0 systemd[1]: Starting Authorization Manager...
Nov 22 09:40:05 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 22 09:40:05 compute-0 polkitd[43509]: Started polkitd version 0.117
Nov 22 09:40:05 compute-0 polkitd[43509]: Loading rules from directory /etc/polkit-1/rules.d
Nov 22 09:40:05 compute-0 polkitd[43509]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 22 09:40:05 compute-0 polkitd[43509]: Finished loading, compiling and executing 2 rules
Nov 22 09:40:05 compute-0 systemd[1]: Started Authorization Manager.
Nov 22 09:40:05 compute-0 polkitd[43509]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 22 09:40:05 compute-0 sudo[43181]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:05 compute-0 sudo[43677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjfkpsqhovidgaypwscfgamvclcsqpsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804405.5626905-444-237336536646482/AnsiballZ_systemd.py'
Nov 22 09:40:05 compute-0 sudo[43677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:06 compute-0 python3.9[43679]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:40:06 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 22 09:40:06 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Nov 22 09:40:06 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 22 09:40:06 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 22 09:40:06 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 22 09:40:06 compute-0 sudo[43677]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:07 compute-0 python3.9[43840]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 22 09:40:09 compute-0 sudo[43990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htevpodtsksrdxyelkbtgmnazuguxxaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804408.8854463-501-140808879224892/AnsiballZ_systemd.py'
Nov 22 09:40:09 compute-0 sudo[43990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:09 compute-0 python3.9[43992]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:40:09 compute-0 systemd[1]: Reloading.
Nov 22 09:40:09 compute-0 systemd-rc-local-generator[44017]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:40:09 compute-0 sudo[43990]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:10 compute-0 sudo[44179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myfdopeumbkxihwwtnhoaufohywqwkmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804410.0259104-501-84598912903854/AnsiballZ_systemd.py'
Nov 22 09:40:10 compute-0 sudo[44179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:10 compute-0 python3.9[44181]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:40:10 compute-0 systemd[1]: Reloading.
Nov 22 09:40:10 compute-0 systemd-rc-local-generator[44206]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:40:11 compute-0 sudo[44179]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:11 compute-0 sudo[44368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsoneshsoibxuvvqrfszcusqkhscstqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804411.2914407-517-202097412105199/AnsiballZ_command.py'
Nov 22 09:40:11 compute-0 sudo[44368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:11 compute-0 python3.9[44370]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:40:11 compute-0 sudo[44368]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:12 compute-0 sudo[44521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aejubdzdvahrcucmfdcxifrtcojajkzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804412.115573-525-141671288665588/AnsiballZ_command.py'
Nov 22 09:40:12 compute-0 sudo[44521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:12 compute-0 python3.9[44523]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:40:12 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 22 09:40:12 compute-0 sudo[44521]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:13 compute-0 sudo[44674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agwfulyrmydyrngdzvkpqytnvfznbvjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804412.9185076-533-198074782648686/AnsiballZ_command.py'
Nov 22 09:40:13 compute-0 sudo[44674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:13 compute-0 python3.9[44676]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:40:14 compute-0 sudo[44674]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:15 compute-0 sudo[44836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aabntmiyicrymwbghcmfygpxnnqvrcae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804415.0685487-541-143373557077551/AnsiballZ_command.py'
Nov 22 09:40:15 compute-0 sudo[44836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:15 compute-0 python3.9[44838]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:40:15 compute-0 sudo[44836]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:16 compute-0 sudo[44989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpvmavsuqphcvzyyhgeqscwrvawuhnky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804415.7601519-549-34937215962118/AnsiballZ_systemd.py'
Nov 22 09:40:16 compute-0 sudo[44989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:16 compute-0 python3.9[44991]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:40:16 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 22 09:40:16 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Nov 22 09:40:16 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Nov 22 09:40:16 compute-0 systemd[1]: Starting Apply Kernel Variables...
Nov 22 09:40:16 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 22 09:40:16 compute-0 systemd[1]: Finished Apply Kernel Variables.
Nov 22 09:40:16 compute-0 sudo[44989]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:17 compute-0 sshd-session[31421]: Connection closed by 192.168.122.30 port 53570
Nov 22 09:40:17 compute-0 sshd-session[31418]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:40:17 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Nov 22 09:40:17 compute-0 systemd[1]: session-9.scope: Consumed 2min 16.388s CPU time.
Nov 22 09:40:17 compute-0 systemd-logind[819]: Session 9 logged out. Waiting for processes to exit.
Nov 22 09:40:17 compute-0 systemd-logind[819]: Removed session 9.
Nov 22 09:40:22 compute-0 sshd-session[45021]: Accepted publickey for zuul from 192.168.122.30 port 34690 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:40:22 compute-0 systemd-logind[819]: New session 10 of user zuul.
Nov 22 09:40:22 compute-0 systemd[1]: Started Session 10 of User zuul.
Nov 22 09:40:22 compute-0 sshd-session[45021]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:40:23 compute-0 python3.9[45174]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:40:25 compute-0 python3.9[45328]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:40:26 compute-0 sudo[45482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tokpftcwewllqstwgcoqxnbbiurjoewl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804425.7258232-50-236467515403159/AnsiballZ_command.py'
Nov 22 09:40:26 compute-0 sudo[45482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:26 compute-0 python3.9[45484]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:40:26 compute-0 sudo[45482]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:27 compute-0 python3.9[45635]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:40:28 compute-0 sudo[45789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klamrhqgxkuvatrtgfpgeztjeanzublt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804428.0166829-70-167924455394051/AnsiballZ_setup.py'
Nov 22 09:40:28 compute-0 sudo[45789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:28 compute-0 python3.9[45791]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:40:28 compute-0 sudo[45789]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:29 compute-0 sudo[45873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwuslblwuhnsipcpbmrzaycpdiwflvwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804428.0166829-70-167924455394051/AnsiballZ_dnf.py'
Nov 22 09:40:29 compute-0 sudo[45873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:29 compute-0 python3.9[45875]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:40:30 compute-0 sudo[45873]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:31 compute-0 sudo[46026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxergxvdtramcjkkfrdtfouuxiqwfenl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804431.1261792-82-131520338828768/AnsiballZ_setup.py'
Nov 22 09:40:31 compute-0 sudo[46026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:31 compute-0 python3.9[46028]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:40:31 compute-0 sudo[46026]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:32 compute-0 sudo[46197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvfiloijnighakefyczejcaytdfiooqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804432.2453845-93-192215199071609/AnsiballZ_file.py'
Nov 22 09:40:32 compute-0 sudo[46197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:32 compute-0 python3.9[46199]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:40:32 compute-0 sudo[46197]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:33 compute-0 sudo[46349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyqoignajamgrangagxgzgmnckuruhet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804433.1017735-101-250866716134422/AnsiballZ_command.py'
Nov 22 09:40:33 compute-0 sudo[46349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:33 compute-0 python3.9[46351]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:40:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3450260751-merged.mount: Deactivated successfully.
Nov 22 09:40:33 compute-0 podman[46352]: 2025-11-22 09:40:33.63086309 +0000 UTC m=+0.056697371 system refresh
Nov 22 09:40:33 compute-0 sudo[46349]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:34 compute-0 sudo[46512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkmautxeyuodusjknjsvqraicwvpgqvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804433.8606358-109-28683511852208/AnsiballZ_stat.py'
Nov 22 09:40:34 compute-0 sudo[46512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:34 compute-0 python3.9[46514]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:40:34 compute-0 sudo[46512]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:40:35 compute-0 sudo[46635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkiuumfjuoxcfxvyouvlkzhkmgitbyzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804433.8606358-109-28683511852208/AnsiballZ_copy.py'
Nov 22 09:40:35 compute-0 sudo[46635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:35 compute-0 python3.9[46637]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804433.8606358-109-28683511852208/.source.json follow=False _original_basename=podman_network_config.j2 checksum=ce89fb18c18ed70f2e65db04cbeef34060d952d2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:40:35 compute-0 sudo[46635]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:35 compute-0 sudo[46787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mubztupkebnquolsloaxcnkkwdkczmwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804435.48108-124-191148442147375/AnsiballZ_stat.py'
Nov 22 09:40:35 compute-0 sudo[46787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:36 compute-0 python3.9[46789]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:40:36 compute-0 sudo[46787]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:36 compute-0 sudo[46910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljxlhmwtqwyfeyaxzydltapmkboxzsku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804435.48108-124-191148442147375/AnsiballZ_copy.py'
Nov 22 09:40:36 compute-0 sudo[46910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:36 compute-0 python3.9[46912]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804435.48108-124-191148442147375/.source.conf follow=False _original_basename=registries.conf.j2 checksum=72d2de0588ced0db2c36127a3128c1d20404e09f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:40:36 compute-0 sudo[46910]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:37 compute-0 sudo[47062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddylyptewykdlmlezocbqnokgwhqrqpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804436.911157-140-151642510563482/AnsiballZ_ini_file.py'
Nov 22 09:40:37 compute-0 sudo[47062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:37 compute-0 python3.9[47064]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:40:37 compute-0 sudo[47062]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:38 compute-0 sudo[47214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyjjpdrtnclvbmcyjylhalnmeinadnik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804437.8156407-140-7944298201193/AnsiballZ_ini_file.py'
Nov 22 09:40:38 compute-0 sudo[47214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:38 compute-0 python3.9[47216]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:40:38 compute-0 sudo[47214]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:38 compute-0 sudo[47366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfeeduvrqntsxzlyqusgxduyajbbafgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804438.5680027-140-272983675707077/AnsiballZ_ini_file.py'
Nov 22 09:40:38 compute-0 sudo[47366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:39 compute-0 python3.9[47368]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:40:39 compute-0 sudo[47366]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:39 compute-0 sudo[47518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntazfhmfyaajcmervnaeiqryjsdmtyki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804439.2155075-140-133290620441741/AnsiballZ_ini_file.py'
Nov 22 09:40:39 compute-0 sudo[47518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:39 compute-0 python3.9[47520]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:40:39 compute-0 sudo[47518]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:40 compute-0 python3.9[47670]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:40:41 compute-0 sudo[47822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cklowdshxwhivcmwyqkmhpvblrqxzwuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804441.111418-180-112082293300023/AnsiballZ_dnf.py'
Nov 22 09:40:41 compute-0 sudo[47822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:41 compute-0 python3.9[47824]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 09:40:42 compute-0 sudo[47822]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:43 compute-0 sudo[47975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyyssltgivsglisgjqxgtuxdtgeryeij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804443.1945944-188-233196789460122/AnsiballZ_dnf.py'
Nov 22 09:40:43 compute-0 sudo[47975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:43 compute-0 python3.9[47977]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 09:40:45 compute-0 sudo[47975]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:46 compute-0 sudo[48135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecxljiphydnkyjoihbttdhhqxaezmjnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804445.863653-198-28828303974090/AnsiballZ_dnf.py'
Nov 22 09:40:46 compute-0 sudo[48135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:46 compute-0 python3.9[48137]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 09:40:47 compute-0 sudo[48135]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:48 compute-0 sudo[48288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkxpsigpyceikoujdvqrckhbagkwtzwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804448.0742576-207-239974711903649/AnsiballZ_dnf.py'
Nov 22 09:40:48 compute-0 sudo[48288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:48 compute-0 python3.9[48290]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 09:40:49 compute-0 sudo[48288]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:50 compute-0 sudo[48441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpopudnycqylrmseudyhkihwxjhwccwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804450.3638341-218-138500486044502/AnsiballZ_dnf.py'
Nov 22 09:40:50 compute-0 sudo[48441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:50 compute-0 python3.9[48443]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 09:40:52 compute-0 sudo[48441]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:53 compute-0 sudo[48597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-davnsgqwbkabmlysvdvwilshrkdukapp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804452.8412135-226-265027401720027/AnsiballZ_dnf.py'
Nov 22 09:40:53 compute-0 sudo[48597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:53 compute-0 python3.9[48599]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 09:40:55 compute-0 sudo[48597]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:56 compute-0 sudo[48767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inssianxtwkegcuhduqglijwrqavbzaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804456.1651316-235-210334454947912/AnsiballZ_dnf.py'
Nov 22 09:40:56 compute-0 sudo[48767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:56 compute-0 python3.9[48769]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 09:40:57 compute-0 sudo[48767]: pam_unix(sudo:session): session closed for user root
Nov 22 09:40:58 compute-0 sudo[48920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ravyekxpjgwrfwdbcggxulfjbztzscqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804458.2981668-244-158144979918487/AnsiballZ_dnf.py'
Nov 22 09:40:58 compute-0 sudo[48920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:40:58 compute-0 python3.9[48922]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 09:41:11 compute-0 sudo[48920]: pam_unix(sudo:session): session closed for user root
Nov 22 09:41:12 compute-0 sudo[49257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzhkbznfejxkwcewoqjpfpcvhizhsmrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804471.805188-253-67546701106480/AnsiballZ_dnf.py'
Nov 22 09:41:12 compute-0 sudo[49257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:41:12 compute-0 python3.9[49259]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 09:41:13 compute-0 sudo[49257]: pam_unix(sudo:session): session closed for user root
Nov 22 09:41:14 compute-0 sudo[49413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhtqeqswozhfkppsdmxsekyyhnrgrwry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804474.1217635-264-3240559639050/AnsiballZ_file.py'
Nov 22 09:41:14 compute-0 sudo[49413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:41:14 compute-0 python3.9[49415]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:41:14 compute-0 sudo[49413]: pam_unix(sudo:session): session closed for user root
Nov 22 09:41:15 compute-0 sudo[49588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meaeptrvuqsnvfxvtevhaigtobykmisa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804474.7730896-272-65296318392171/AnsiballZ_stat.py'
Nov 22 09:41:15 compute-0 sudo[49588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:41:15 compute-0 python3.9[49590]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:41:15 compute-0 sudo[49588]: pam_unix(sudo:session): session closed for user root
Nov 22 09:41:15 compute-0 sudo[49711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhetuxlxazytmvgbpnurpeymejzillvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804474.7730896-272-65296318392171/AnsiballZ_copy.py'
Nov 22 09:41:15 compute-0 sudo[49711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:41:15 compute-0 python3.9[49713]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1763804474.7730896-272-65296318392171/.source.json _original_basename=.yplwl5o5 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:41:16 compute-0 sudo[49711]: pam_unix(sudo:session): session closed for user root
Nov 22 09:41:16 compute-0 sudo[49863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glzvuqksgeuclcdutfealgdhtwnyqlnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804476.359109-290-69426544143230/AnsiballZ_podman_image.py'
Nov 22 09:41:16 compute-0 sudo[49863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:41:17 compute-0 python3.9[49865]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 09:41:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3563717335-lower\x2dmapped.mount: Deactivated successfully.
Nov 22 09:41:22 compute-0 podman[49877]: 2025-11-22 09:41:22.356361655 +0000 UTC m=+5.088257275 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 09:41:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:22 compute-0 sudo[49863]: pam_unix(sudo:session): session closed for user root
Nov 22 09:41:23 compute-0 sudo[50169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejcminoeadazvxzaqodzrqdajledqfzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804482.8978148-301-184408214354231/AnsiballZ_podman_image.py'
Nov 22 09:41:23 compute-0 sudo[50169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:41:23 compute-0 python3.9[50171]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 09:41:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:32 compute-0 podman[50183]: 2025-11-22 09:41:32.208288596 +0000 UTC m=+8.609950655 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 09:41:32 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:32 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:32 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:32 compute-0 sudo[50169]: pam_unix(sudo:session): session closed for user root
Nov 22 09:41:33 compute-0 sudo[50478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syugsusenwnmubhryzuqmarujuhqksio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804492.7698612-311-114923075132722/AnsiballZ_podman_image.py'
Nov 22 09:41:33 compute-0 sudo[50478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:41:33 compute-0 python3.9[50480]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 09:41:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:34 compute-0 podman[50492]: 2025-11-22 09:41:34.347175114 +0000 UTC m=+1.034148380 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 09:41:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:34 compute-0 sudo[50478]: pam_unix(sudo:session): session closed for user root
Nov 22 09:41:35 compute-0 sudo[50723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvunqcrarcadlesliwvrsmzyngrhraca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804494.8020399-320-216160908615429/AnsiballZ_podman_image.py'
Nov 22 09:41:35 compute-0 sudo[50723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:41:35 compute-0 python3.9[50725]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 09:41:44 compute-0 podman[50738]: 2025-11-22 09:41:44.446850353 +0000 UTC m=+9.098756133 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 09:41:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:44 compute-0 sudo[50723]: pam_unix(sudo:session): session closed for user root
Nov 22 09:41:45 compute-0 sudo[50991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvcnxviwfonqjhjstnoobdruuidxknra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804505.2073054-331-107497975764598/AnsiballZ_podman_image.py'
Nov 22 09:41:45 compute-0 sudo[50991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:41:45 compute-0 python3.9[50993]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 09:41:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:49 compute-0 podman[51005]: 2025-11-22 09:41:49.703034815 +0000 UTC m=+3.855173130 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 22 09:41:49 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:49 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:49 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:49 compute-0 sudo[50991]: pam_unix(sudo:session): session closed for user root
Nov 22 09:41:50 compute-0 sudo[51260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehgtdxrstdamcilwatdgxrnigdyxzbjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804510.1272507-331-23081781571309/AnsiballZ_podman_image.py'
Nov 22 09:41:50 compute-0 sudo[51260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:41:50 compute-0 python3.9[51262]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 09:41:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:51 compute-0 podman[51275]: 2025-11-22 09:41:51.937768999 +0000 UTC m=+1.046376601 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 22 09:41:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:41:52 compute-0 sudo[51260]: pam_unix(sudo:session): session closed for user root
Nov 22 09:41:52 compute-0 sshd-session[45024]: Connection closed by 192.168.122.30 port 34690
Nov 22 09:41:52 compute-0 sshd-session[45021]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:41:52 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Nov 22 09:41:52 compute-0 systemd[1]: session-10.scope: Consumed 1min 48.851s CPU time.
Nov 22 09:41:52 compute-0 systemd-logind[819]: Session 10 logged out. Waiting for processes to exit.
Nov 22 09:41:52 compute-0 systemd-logind[819]: Removed session 10.
Nov 22 09:41:58 compute-0 sshd-session[51422]: Accepted publickey for zuul from 192.168.122.30 port 59772 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:41:58 compute-0 systemd-logind[819]: New session 11 of user zuul.
Nov 22 09:41:59 compute-0 systemd[1]: Started Session 11 of User zuul.
Nov 22 09:41:59 compute-0 sshd-session[51422]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:42:00 compute-0 python3.9[51575]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:42:01 compute-0 sudo[51729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkopqkxajsxduoozzmjhjvkponmqkztd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804520.7915866-36-122938721557851/AnsiballZ_getent.py'
Nov 22 09:42:01 compute-0 sudo[51729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:01 compute-0 python3.9[51731]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 22 09:42:01 compute-0 sudo[51729]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:02 compute-0 sudo[51882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsnvljihlclpdacjrhddnyiipkkvbzwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804521.8578396-44-209720601764784/AnsiballZ_group.py'
Nov 22 09:42:02 compute-0 sudo[51882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:02 compute-0 python3.9[51884]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 09:42:02 compute-0 groupadd[51885]: group added to /etc/group: name=openvswitch, GID=42476
Nov 22 09:42:02 compute-0 groupadd[51885]: group added to /etc/gshadow: name=openvswitch
Nov 22 09:42:02 compute-0 groupadd[51885]: new group: name=openvswitch, GID=42476
Nov 22 09:42:02 compute-0 sudo[51882]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:03 compute-0 sudo[52040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flgucdijmyhruddzuryzzdeocnyczfds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804523.0119226-52-11459009782601/AnsiballZ_user.py'
Nov 22 09:42:03 compute-0 sudo[52040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:03 compute-0 python3.9[52042]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 09:42:03 compute-0 useradd[52044]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Nov 22 09:42:03 compute-0 useradd[52044]: add 'openvswitch' to group 'hugetlbfs'
Nov 22 09:42:03 compute-0 useradd[52044]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 22 09:42:03 compute-0 sudo[52040]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:04 compute-0 sudo[52200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njftxcotpgsyktasoffvewsqpntwpuvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804524.2446258-62-87923597872764/AnsiballZ_setup.py'
Nov 22 09:42:04 compute-0 sudo[52200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:04 compute-0 python3.9[52202]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:42:05 compute-0 sudo[52200]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:05 compute-0 sudo[52284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhbksufatrdrwtwgtmfzcgmsfxrbndty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804524.2446258-62-87923597872764/AnsiballZ_dnf.py'
Nov 22 09:42:05 compute-0 sudo[52284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:05 compute-0 python3.9[52286]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 09:42:07 compute-0 sudo[52284]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:08 compute-0 sudo[52446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aavbkszikuajavqirqyqqzdpebrhsghv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804527.870699-76-243689815891169/AnsiballZ_dnf.py'
Nov 22 09:42:08 compute-0 sudo[52446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:08 compute-0 python3.9[52448]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:42:36 compute-0 kernel: SELinux:  Converting 2731 SID table entries...
Nov 22 09:42:36 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 09:42:36 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 09:42:36 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 09:42:36 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 09:42:36 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 09:42:36 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 09:42:36 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 09:42:36 compute-0 groupadd[52472]: group added to /etc/group: name=unbound, GID=993
Nov 22 09:42:36 compute-0 groupadd[52472]: group added to /etc/gshadow: name=unbound
Nov 22 09:42:36 compute-0 groupadd[52472]: new group: name=unbound, GID=993
Nov 22 09:42:37 compute-0 useradd[52479]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Nov 22 09:42:37 compute-0 dbus-broker-launch[813]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 22 09:42:37 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 22 09:42:39 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 09:42:39 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 09:42:39 compute-0 systemd[1]: Reloading.
Nov 22 09:42:39 compute-0 systemd-rc-local-generator[52974]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:42:39 compute-0 systemd-sysv-generator[52978]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:42:39 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 09:42:40 compute-0 sudo[52446]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:41 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 09:42:41 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 09:42:41 compute-0 systemd[1]: run-rbc6e1416a62a4435af27395ae6ce3f04.service: Deactivated successfully.
Nov 22 09:42:41 compute-0 sudo[53544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnqmqazjzdrrtghoynhuvxrborsoyomc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804560.640931-84-106930457335045/AnsiballZ_systemd.py'
Nov 22 09:42:41 compute-0 sudo[53544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:41 compute-0 python3.9[53547]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 09:42:41 compute-0 systemd[1]: Reloading.
Nov 22 09:42:41 compute-0 systemd-rc-local-generator[53577]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:42:41 compute-0 systemd-sysv-generator[53580]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:42:41 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Nov 22 09:42:41 compute-0 chown[53588]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 22 09:42:42 compute-0 ovs-ctl[53593]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 22 09:42:42 compute-0 ovs-ctl[53593]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 22 09:42:42 compute-0 ovs-ctl[53593]: Starting ovsdb-server [  OK  ]
Nov 22 09:42:42 compute-0 ovs-vsctl[53642]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 22 09:42:42 compute-0 ovs-vsctl[53658]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"f6533837-2723-4772-a9db-3c9eeea0db5c\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 22 09:42:42 compute-0 ovs-ctl[53593]: Configuring Open vSwitch system IDs [  OK  ]
Nov 22 09:42:42 compute-0 ovs-vsctl[53668]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 22 09:42:42 compute-0 ovs-ctl[53593]: Enabling remote OVSDB managers [  OK  ]
Nov 22 09:42:42 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Nov 22 09:42:42 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 22 09:42:42 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 22 09:42:42 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 22 09:42:42 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Nov 22 09:42:42 compute-0 ovs-ctl[53713]: Inserting openvswitch module [  OK  ]
Nov 22 09:42:42 compute-0 ovs-ctl[53682]: Starting ovs-vswitchd [  OK  ]
Nov 22 09:42:42 compute-0 ovs-vsctl[53731]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 22 09:42:42 compute-0 ovs-ctl[53682]: Enabling remote OVSDB managers [  OK  ]
Nov 22 09:42:42 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 22 09:42:42 compute-0 systemd[1]: Starting Open vSwitch...
Nov 22 09:42:42 compute-0 systemd[1]: Finished Open vSwitch.
Nov 22 09:42:42 compute-0 sudo[53544]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:43 compute-0 python3.9[53882]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:42:44 compute-0 sudo[54032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iabdmxrjdwsptdomcijpxlapcpvhgpdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804563.8159795-102-156332939697870/AnsiballZ_sefcontext.py'
Nov 22 09:42:44 compute-0 sudo[54032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:44 compute-0 python3.9[54034]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 22 09:42:45 compute-0 kernel: SELinux:  Converting 2745 SID table entries...
Nov 22 09:42:45 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 09:42:45 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 09:42:45 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 09:42:45 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 09:42:45 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 09:42:45 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 09:42:45 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 09:42:45 compute-0 sudo[54032]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:46 compute-0 python3.9[54189]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:42:47 compute-0 sudo[54345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juggryakymijrmygqeeykvuvwmekimfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804567.3215783-120-150975706880405/AnsiballZ_dnf.py'
Nov 22 09:42:47 compute-0 dbus-broker-launch[813]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 22 09:42:47 compute-0 sudo[54345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:47 compute-0 python3.9[54347]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:42:49 compute-0 sudo[54345]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:49 compute-0 sudo[54498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sligvwbkkqzykweaefrzsdoyaektwypn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804569.3147929-128-62025402290815/AnsiballZ_command.py'
Nov 22 09:42:49 compute-0 sudo[54498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:49 compute-0 python3.9[54500]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:42:50 compute-0 sudo[54498]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:51 compute-0 sudo[54785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyphpzjmmclfkcpqlidoukhbxcrjtntf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804570.7873535-136-183880563010702/AnsiballZ_file.py'
Nov 22 09:42:51 compute-0 sudo[54785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:51 compute-0 python3.9[54787]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 09:42:51 compute-0 sudo[54785]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:52 compute-0 python3.9[54937]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:42:52 compute-0 sudo[55089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieclspukmxpfjhaatzmduasgxihdetxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804572.575546-152-64447635613860/AnsiballZ_dnf.py'
Nov 22 09:42:52 compute-0 sudo[55089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:53 compute-0 python3.9[55091]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:42:55 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 09:42:55 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 09:42:55 compute-0 systemd[1]: Reloading.
Nov 22 09:42:55 compute-0 systemd-sysv-generator[55133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:42:55 compute-0 systemd-rc-local-generator[55130]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:42:55 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 09:42:55 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 09:42:55 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 09:42:55 compute-0 systemd[1]: run-r9cae6c15608e49b285e83efcaa31c679.service: Deactivated successfully.
Nov 22 09:42:55 compute-0 sudo[55089]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:56 compute-0 sudo[55405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liqfufygjuielmlqogxexrrymuwlucgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804575.8139484-160-155182776131170/AnsiballZ_systemd.py'
Nov 22 09:42:56 compute-0 sudo[55405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:56 compute-0 python3.9[55407]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:42:56 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 22 09:42:56 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Nov 22 09:42:56 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Nov 22 09:42:56 compute-0 systemd[1]: Stopping Network Manager...
Nov 22 09:42:56 compute-0 NetworkManager[7185]: <info>  [1763804576.5683] caught SIGTERM, shutting down normally.
Nov 22 09:42:56 compute-0 NetworkManager[7185]: <info>  [1763804576.5702] dhcp4 (eth0): canceled DHCP transaction
Nov 22 09:42:56 compute-0 NetworkManager[7185]: <info>  [1763804576.5703] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 09:42:56 compute-0 NetworkManager[7185]: <info>  [1763804576.5703] dhcp4 (eth0): state changed no lease
Nov 22 09:42:56 compute-0 NetworkManager[7185]: <info>  [1763804576.5707] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 09:42:56 compute-0 NetworkManager[7185]: <info>  [1763804576.5819] exiting (success)
Nov 22 09:42:56 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 09:42:56 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 09:42:56 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 22 09:42:56 compute-0 systemd[1]: Stopped Network Manager.
Nov 22 09:42:56 compute-0 systemd[1]: NetworkManager.service: Consumed 14.420s CPU time, 4.0M memory peak, read 0B from disk, written 41.0K to disk.
Nov 22 09:42:56 compute-0 systemd[1]: Starting Network Manager...
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.6711] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:3931c0a4-baf1-4f89-bc19-8b6e9c477257)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.6712] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.6767] manager[0x55c50ebb6090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 09:42:56 compute-0 systemd[1]: Starting Hostname Service...
Nov 22 09:42:56 compute-0 systemd[1]: Started Hostname Service.
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7511] hostname: hostname: using hostnamed
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7511] hostname: static hostname changed from (none) to "compute-0"
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7516] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7520] manager[0x55c50ebb6090]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7520] manager[0x55c50ebb6090]: rfkill: WWAN hardware radio set enabled
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7538] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7546] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7547] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7547] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7548] manager: Networking is enabled by state file
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7549] settings: Loaded settings plugin: keyfile (internal)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7552] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7574] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7583] dhcp: init: Using DHCP client 'internal'
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7585] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7589] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7593] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7599] device (lo): Activation: starting connection 'lo' (6f9484ed-c385-45a5-b64f-d6d1d6763032)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7604] device (eth0): carrier: link connected
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7608] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7612] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7612] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7618] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7624] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7629] device (eth1): carrier: link connected
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7632] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7636] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (6f14d473-5f78-55da-8fa7-2b29b4ec1411) (indicated)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7637] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7640] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7646] device (eth1): Activation: starting connection 'ci-private-network' (6f14d473-5f78-55da-8fa7-2b29b4ec1411)
Nov 22 09:42:56 compute-0 systemd[1]: Started Network Manager.
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7654] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7668] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7670] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7671] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7673] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7676] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7678] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7679] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7681] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7686] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7689] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7697] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7709] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7719] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7720] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7724] device (lo): Activation: successful, device activated.
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7731] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7732] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7734] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7737] device (eth1): Activation: successful, device activated.
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7745] dhcp4 (eth0): state changed new lease, address=38.129.56.220
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7753] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 09:42:56 compute-0 systemd[1]: Starting Network Manager Wait Online...
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7850] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7888] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7890] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7894] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7899] device (eth0): Activation: successful, device activated.
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7905] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 09:42:56 compute-0 NetworkManager[55425]: <info>  [1763804576.7908] manager: startup complete
Nov 22 09:42:56 compute-0 systemd[1]: Finished Network Manager Wait Online.
Nov 22 09:42:56 compute-0 sudo[55405]: pam_unix(sudo:session): session closed for user root
Nov 22 09:42:57 compute-0 sudo[55631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkkvkztnuwfkdhncptlkzgtyzziqezxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804577.0351462-168-117880087791461/AnsiballZ_dnf.py'
Nov 22 09:42:57 compute-0 sudo[55631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:42:57 compute-0 python3.9[55633]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:43:02 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 09:43:02 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 09:43:02 compute-0 systemd[1]: Reloading.
Nov 22 09:43:02 compute-0 systemd-rc-local-generator[55690]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:43:02 compute-0 systemd-sysv-generator[55693]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:43:02 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 09:43:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 09:43:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 09:43:03 compute-0 systemd[1]: run-r28112a31d9e84203878e144d09dc528d.service: Deactivated successfully.
Nov 22 09:43:03 compute-0 sudo[55631]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:03 compute-0 sudo[56092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egfyqopivjoyiaewwegdwoogymuezdhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804583.4819517-180-135780109559249/AnsiballZ_stat.py'
Nov 22 09:43:03 compute-0 sudo[56092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:03 compute-0 python3.9[56094]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:43:03 compute-0 sudo[56092]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:04 compute-0 sudo[56244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcxmtuovkpjdmcdgxlhtwxtcswazrdnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804584.2187104-189-148854634604866/AnsiballZ_ini_file.py'
Nov 22 09:43:04 compute-0 sudo[56244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:04 compute-0 python3.9[56246]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:04 compute-0 sudo[56244]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:05 compute-0 sudo[56398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoynjxcuiuoxptuujdoyhadvhnpycogy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804585.1871023-199-53974111714587/AnsiballZ_ini_file.py'
Nov 22 09:43:05 compute-0 sudo[56398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:05 compute-0 python3.9[56400]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:05 compute-0 sudo[56398]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:06 compute-0 sudo[56550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqmjmblszekyxzjojsjklonwxnjmatnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804585.9295592-199-202395638747125/AnsiballZ_ini_file.py'
Nov 22 09:43:06 compute-0 sudo[56550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:06 compute-0 python3.9[56552]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:06 compute-0 sudo[56550]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:06 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 09:43:07 compute-0 sudo[56702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfchavkkkzwpguioqvdkqzjmjrvmcqxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804586.7936027-214-54270741082659/AnsiballZ_ini_file.py'
Nov 22 09:43:07 compute-0 sudo[56702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:07 compute-0 python3.9[56704]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:07 compute-0 sudo[56702]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:07 compute-0 sudo[56854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orovosvczufiipjeefhpqvrrstuobgcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804587.5511065-214-30247142300324/AnsiballZ_ini_file.py'
Nov 22 09:43:07 compute-0 sudo[56854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:08 compute-0 python3.9[56856]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:08 compute-0 sudo[56854]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:08 compute-0 sudo[57006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihcipormnkzrydxjzakchkaoesxhwrwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804588.4957993-229-167529103761033/AnsiballZ_stat.py'
Nov 22 09:43:08 compute-0 sudo[57006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:09 compute-0 python3.9[57008]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:43:09 compute-0 sudo[57006]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:09 compute-0 sudo[57129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvtqbvroqakdommjoqxyujbusoezfzdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804588.4957993-229-167529103761033/AnsiballZ_copy.py'
Nov 22 09:43:09 compute-0 sudo[57129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:09 compute-0 python3.9[57131]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804588.4957993-229-167529103761033/.source _original_basename=.dter9efi follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:09 compute-0 sudo[57129]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:10 compute-0 sudo[57281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iympbcjyiaazzbsdjltvfjdnjjyzcalg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804590.1392558-244-163504227655040/AnsiballZ_file.py'
Nov 22 09:43:10 compute-0 sudo[57281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:10 compute-0 python3.9[57283]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:10 compute-0 sudo[57281]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:11 compute-0 sudo[57433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdrhjlqkwjzszzoazbeocggzwzhkdlav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804590.9308577-252-120685085331665/AnsiballZ_edpm_os_net_config_mappings.py'
Nov 22 09:43:11 compute-0 sudo[57433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:11 compute-0 python3.9[57435]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 22 09:43:11 compute-0 sudo[57433]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:12 compute-0 sudo[57585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inytbcgckdtlzkypcajqmctcauolbftu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804591.9333282-261-240437699403430/AnsiballZ_file.py'
Nov 22 09:43:12 compute-0 sudo[57585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:12 compute-0 python3.9[57587]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:12 compute-0 sudo[57585]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:13 compute-0 sudo[57737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfsjaropyfephogxxkrppiwoklpxkwgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804593.0841098-271-127028609096689/AnsiballZ_stat.py'
Nov 22 09:43:13 compute-0 sudo[57737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:13 compute-0 sudo[57737]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:14 compute-0 sudo[57860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efihehpwrdjmtcoxbnfjojdzqzhietxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804593.0841098-271-127028609096689/AnsiballZ_copy.py'
Nov 22 09:43:14 compute-0 sudo[57860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:14 compute-0 sudo[57860]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:14 compute-0 sudo[58012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xetwjgmqyjynxgwlndkbnlukhtulwvjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804594.449974-286-96489295023130/AnsiballZ_slurp.py'
Nov 22 09:43:14 compute-0 sudo[58012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:15 compute-0 python3.9[58014]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 22 09:43:15 compute-0 sudo[58012]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:16 compute-0 sudo[58187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xerjbqfiptjzkpilruovsjhmlzshavod ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804595.4345899-295-189694795450980/async_wrapper.py j778436535616 300 /home/zuul/.ansible/tmp/ansible-tmp-1763804595.4345899-295-189694795450980/AnsiballZ_edpm_os_net_config.py _'
Nov 22 09:43:16 compute-0 sudo[58187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:16 compute-0 ansible-async_wrapper.py[58189]: Invoked with j778436535616 300 /home/zuul/.ansible/tmp/ansible-tmp-1763804595.4345899-295-189694795450980/AnsiballZ_edpm_os_net_config.py _
Nov 22 09:43:16 compute-0 ansible-async_wrapper.py[58192]: Starting module and watcher
Nov 22 09:43:16 compute-0 ansible-async_wrapper.py[58192]: Start watching 58193 (300)
Nov 22 09:43:16 compute-0 ansible-async_wrapper.py[58193]: Start module (58193)
Nov 22 09:43:16 compute-0 ansible-async_wrapper.py[58189]: Return async_wrapper task started.
Nov 22 09:43:16 compute-0 sudo[58187]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:16 compute-0 python3.9[58194]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 22 09:43:17 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 22 09:43:17 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 22 09:43:17 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 22 09:43:17 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 22 09:43:17 compute-0 kernel: cfg80211: failed to load regulatory.db
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4015] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4037] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4764] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4769] audit: op="connection-add" uuid="91b40b80-af4f-4d73-9009-c6034f7ad414" name="br-ex-br" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4797] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4802] audit: op="connection-add" uuid="4af7d2b2-1cc6-4681-9d08-854ac8d76017" name="br-ex-port" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4818] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4820] audit: op="connection-add" uuid="ae8290c0-3ebc-47f6-bbc9-5c833b18015b" name="eth1-port" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4833] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4834] audit: op="connection-add" uuid="30f5dc8e-3196-4574-a5f4-56fe71a98d43" name="vlan20-port" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4848] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4849] audit: op="connection-add" uuid="26c963d2-d2a7-4328-8081-6e7d63d2493b" name="vlan21-port" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4862] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4863] audit: op="connection-add" uuid="11379db7-c85a-467e-8f19-213a97f99e40" name="vlan22-port" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4884] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4901] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4902] audit: op="connection-add" uuid="e09ae67f-e656-4012-95a8-1e5dab413c28" name="br-ex-if" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4956] audit: op="connection-update" uuid="6f14d473-5f78-55da-8fa7-2b29b4ec1411" name="ci-private-network" args="ipv4.never-default,ipv4.addresses,ipv4.routing-rules,ipv4.dns,ipv4.method,ipv4.routes,connection.port-type,connection.slave-type,connection.controller,connection.master,connection.timestamp,ovs-interface.type,ovs-external-ids.data,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routing-rules,ipv6.dns,ipv6.method,ipv6.routes" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4979] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.4982] audit: op="connection-add" uuid="32355fd2-1b73-47b9-b405-7659f1631325" name="vlan20-if" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5003] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5006] audit: op="connection-add" uuid="a989f956-cb78-4c4e-9e69-e07608aaa8b3" name="vlan21-if" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5029] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5032] audit: op="connection-add" uuid="e20ece41-486a-40f2-8201-8c3cf401f942" name="vlan22-if" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5047] audit: op="connection-delete" uuid="2d6579af-6f98-3948-b6ff-43aec0679b8f" name="Wired connection 1" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5062] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5076] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5080] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (91b40b80-af4f-4d73-9009-c6034f7ad414)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5084] audit: op="connection-activate" uuid="91b40b80-af4f-4d73-9009-c6034f7ad414" name="br-ex-br" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5087] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5095] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5102] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (4af7d2b2-1cc6-4681-9d08-854ac8d76017)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5109] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5117] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5124] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (ae8290c0-3ebc-47f6-bbc9-5c833b18015b)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5127] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5139] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5144] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (30f5dc8e-3196-4574-a5f4-56fe71a98d43)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5148] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5158] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5164] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (26c963d2-d2a7-4328-8081-6e7d63d2493b)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5168] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5177] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5182] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (11379db7-c85a-467e-8f19-213a97f99e40)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5183] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5188] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5191] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5199] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5206] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5212] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (e09ae67f-e656-4012-95a8-1e5dab413c28)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5213] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5218] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5221] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5223] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5226] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5241] device (eth1): disconnecting for new activation request.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5242] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5248] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5250] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5253] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5257] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5263] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5270] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (32355fd2-1b73-47b9-b405-7659f1631325)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5272] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5276] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5279] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5281] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5285] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5296] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5304] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (a989f956-cb78-4c4e-9e69-e07608aaa8b3)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5305] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5310] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5312] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5314] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5318] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5324] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5330] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (e20ece41-486a-40f2-8201-8c3cf401f942)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5331] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5335] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5338] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5340] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5342] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5364] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,connection.autoconnect-priority,ipv6.addr-gen-mode,ipv6.method" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5366] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5372] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5374] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5384] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5390] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5396] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5400] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5404] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5410] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5415] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5418] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5420] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5426] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5430] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5433] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 kernel: ovs-system: entered promiscuous mode
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5435] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5441] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5446] dhcp4 (eth0): canceled DHCP transaction
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5446] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5446] dhcp4 (eth0): state changed no lease
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5448] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5463] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 22 09:43:18 compute-0 kernel: Timeout policy base is empty
Nov 22 09:43:18 compute-0 systemd-udevd[58200]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5468] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58195 uid=0 result="fail" reason="Device is not activated"
Nov 22 09:43:18 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5509] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5516] dhcp4 (eth0): state changed new lease, address=38.129.56.220
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5520] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5564] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5571] device (eth1): disconnecting for new activation request.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5572] audit: op="connection-activate" uuid="6f14d473-5f78-55da-8fa7-2b29b4ec1411" name="ci-private-network" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5631] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58195 uid=0 result="success"
Nov 22 09:43:18 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5699] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5836] device (eth1): Activation: starting connection 'ci-private-network' (6f14d473-5f78-55da-8fa7-2b29b4ec1411)
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5841] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5850] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5856] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5864] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5870] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5876] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5877] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5879] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5880] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5881] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5899] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5908] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5914] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5918] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5923] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5928] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5932] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5937] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5942] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5946] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5951] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5957] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5965] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 kernel: br-ex: entered promiscuous mode
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.5998] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6003] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6009] device (eth1): Activation: successful, device activated.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6126] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6138] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6160] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6162] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6169] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 09:43:18 compute-0 kernel: vlan22: entered promiscuous mode
Nov 22 09:43:18 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 22 09:43:18 compute-0 kernel: vlan21: entered promiscuous mode
Nov 22 09:43:18 compute-0 systemd-udevd[58201]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6308] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6321] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6340] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6342] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6349] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 09:43:18 compute-0 kernel: vlan20: entered promiscuous mode
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6456] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6476] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6496] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6499] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6508] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6529] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6549] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6594] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6597] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 09:43:18 compute-0 NetworkManager[55425]: <info>  [1763804598.6609] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 09:43:19 compute-0 NetworkManager[55425]: <info>  [1763804599.7952] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58195 uid=0 result="success"
Nov 22 09:43:19 compute-0 NetworkManager[55425]: <info>  [1763804599.9636] checkpoint[0x55c50eb8c950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 22 09:43:19 compute-0 NetworkManager[55425]: <info>  [1763804599.9637] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58195 uid=0 result="success"
Nov 22 09:43:20 compute-0 sudo[58526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwraeatxjliedfkefkevsjtjsxroglaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804599.5513985-295-178039640665127/AnsiballZ_async_status.py'
Nov 22 09:43:20 compute-0 sudo[58526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:20 compute-0 NetworkManager[55425]: <info>  [1763804600.2732] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58195 uid=0 result="success"
Nov 22 09:43:20 compute-0 NetworkManager[55425]: <info>  [1763804600.2787] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58195 uid=0 result="success"
Nov 22 09:43:20 compute-0 python3.9[58528]: ansible-ansible.legacy.async_status Invoked with jid=j778436535616.58189 mode=status _async_dir=/root/.ansible_async
Nov 22 09:43:20 compute-0 sudo[58526]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:20 compute-0 NetworkManager[55425]: <info>  [1763804600.5596] audit: op="networking-control" arg="global-dns-configuration" pid=58195 uid=0 result="success"
Nov 22 09:43:20 compute-0 NetworkManager[55425]: <info>  [1763804600.6640] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 22 09:43:20 compute-0 NetworkManager[55425]: <info>  [1763804600.7042] audit: op="networking-control" arg="global-dns-configuration" pid=58195 uid=0 result="success"
Nov 22 09:43:20 compute-0 NetworkManager[55425]: <info>  [1763804600.7084] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58195 uid=0 result="success"
Nov 22 09:43:20 compute-0 NetworkManager[55425]: <info>  [1763804600.9337] checkpoint[0x55c50eb8ca20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 22 09:43:20 compute-0 NetworkManager[55425]: <info>  [1763804600.9346] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58195 uid=0 result="success"
Nov 22 09:43:21 compute-0 ansible-async_wrapper.py[58193]: Module complete (58193)
Nov 22 09:43:21 compute-0 ansible-async_wrapper.py[58192]: Done in kid B.
Nov 22 09:43:23 compute-0 sudo[58631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkkikrbqjvkxlxgqpbhusvmyduerzbva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804599.5513985-295-178039640665127/AnsiballZ_async_status.py'
Nov 22 09:43:23 compute-0 sudo[58631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:23 compute-0 python3.9[58633]: ansible-ansible.legacy.async_status Invoked with jid=j778436535616.58189 mode=status _async_dir=/root/.ansible_async
Nov 22 09:43:23 compute-0 sudo[58631]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:24 compute-0 sudo[58731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngkmmwilwtulxtraqxcbtelhuqmqiwub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804599.5513985-295-178039640665127/AnsiballZ_async_status.py'
Nov 22 09:43:24 compute-0 sudo[58731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:24 compute-0 python3.9[58733]: ansible-ansible.legacy.async_status Invoked with jid=j778436535616.58189 mode=cleanup _async_dir=/root/.ansible_async
Nov 22 09:43:24 compute-0 sudo[58731]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:25 compute-0 sudo[58883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulscslfuhzbyievpfxvbopmleutxzjfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804604.7431307-322-114407378469397/AnsiballZ_stat.py'
Nov 22 09:43:25 compute-0 sudo[58883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:25 compute-0 python3.9[58885]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:43:25 compute-0 sudo[58883]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:25 compute-0 sudo[59006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqxfibtagmtfmzcqflyvjakhdcpswhgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804604.7431307-322-114407378469397/AnsiballZ_copy.py'
Nov 22 09:43:25 compute-0 sudo[59006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:26 compute-0 python3.9[59008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804604.7431307-322-114407378469397/.source.returncode _original_basename=.84nwyycc follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:26 compute-0 sudo[59006]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:26 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 09:43:27 compute-0 sudo[59161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhddhxxhyniocsnqybhkwjrkodqexrno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804606.5463178-338-255568145704909/AnsiballZ_stat.py'
Nov 22 09:43:27 compute-0 sudo[59161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:27 compute-0 python3.9[59163]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:43:27 compute-0 sudo[59161]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:27 compute-0 sudo[59284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxwodhliryphvubynxswjcsotabpbojn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804606.5463178-338-255568145704909/AnsiballZ_copy.py'
Nov 22 09:43:27 compute-0 sudo[59284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:28 compute-0 python3.9[59286]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804606.5463178-338-255568145704909/.source.cfg _original_basename=.19g5fsfu follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:28 compute-0 sudo[59284]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:28 compute-0 sudo[59436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipqqbzemnkyiuwbimwommzgvkesczbpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804608.376803-353-31462032548265/AnsiballZ_systemd.py'
Nov 22 09:43:28 compute-0 sudo[59436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:29 compute-0 python3.9[59438]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:43:29 compute-0 systemd[1]: Reloading Network Manager...
Nov 22 09:43:29 compute-0 NetworkManager[55425]: <info>  [1763804609.2076] audit: op="reload" arg="0" pid=59442 uid=0 result="success"
Nov 22 09:43:29 compute-0 NetworkManager[55425]: <info>  [1763804609.2082] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 22 09:43:29 compute-0 systemd[1]: Reloaded Network Manager.
Nov 22 09:43:29 compute-0 sudo[59436]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:29 compute-0 sshd-session[51425]: Connection closed by 192.168.122.30 port 59772
Nov 22 09:43:29 compute-0 sshd-session[51422]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:43:29 compute-0 systemd-logind[819]: Session 11 logged out. Waiting for processes to exit.
Nov 22 09:43:29 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Nov 22 09:43:29 compute-0 systemd[1]: session-11.scope: Consumed 51.910s CPU time.
Nov 22 09:43:29 compute-0 systemd-logind[819]: Removed session 11.
Nov 22 09:43:35 compute-0 sshd-session[59473]: Accepted publickey for zuul from 192.168.122.30 port 36366 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:43:35 compute-0 systemd-logind[819]: New session 12 of user zuul.
Nov 22 09:43:35 compute-0 systemd[1]: Started Session 12 of User zuul.
Nov 22 09:43:35 compute-0 sshd-session[59473]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:43:36 compute-0 python3.9[59626]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:43:37 compute-0 python3.9[59781]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:43:38 compute-0 python3.9[59970]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:43:39 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 09:43:39 compute-0 sshd-session[59476]: Connection closed by 192.168.122.30 port 36366
Nov 22 09:43:39 compute-0 sshd-session[59473]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:43:39 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Nov 22 09:43:39 compute-0 systemd[1]: session-12.scope: Consumed 2.449s CPU time.
Nov 22 09:43:39 compute-0 systemd-logind[819]: Session 12 logged out. Waiting for processes to exit.
Nov 22 09:43:39 compute-0 systemd-logind[819]: Removed session 12.
Nov 22 09:43:44 compute-0 sshd-session[59999]: Accepted publickey for zuul from 192.168.122.30 port 44274 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:43:44 compute-0 systemd-logind[819]: New session 13 of user zuul.
Nov 22 09:43:44 compute-0 systemd[1]: Started Session 13 of User zuul.
Nov 22 09:43:44 compute-0 sshd-session[59999]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:43:45 compute-0 python3.9[60152]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:43:46 compute-0 python3.9[60306]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:43:46 compute-0 sudo[60461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpzpfxljomxikboebozvminzlmrinekw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804626.6580315-40-47159888846864/AnsiballZ_setup.py'
Nov 22 09:43:47 compute-0 sudo[60461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:47 compute-0 python3.9[60463]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:43:47 compute-0 sudo[60461]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:47 compute-0 sudo[60545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-somiurftawkcwtfkmvowoverjhqbsjiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804626.6580315-40-47159888846864/AnsiballZ_dnf.py'
Nov 22 09:43:48 compute-0 sudo[60545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:48 compute-0 python3.9[60547]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:43:49 compute-0 sudo[60545]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:50 compute-0 sudo[60699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydnufceimzsazipafaldrksjjfcgihgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804629.7845469-52-198927025581759/AnsiballZ_setup.py'
Nov 22 09:43:50 compute-0 sudo[60699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:50 compute-0 python3.9[60701]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:43:50 compute-0 sudo[60699]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:51 compute-0 sudo[60890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpdslijmvjlexhkdhfljeekqfktpbnxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804631.2011638-63-27998845982367/AnsiballZ_file.py'
Nov 22 09:43:51 compute-0 sudo[60890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:51 compute-0 python3.9[60892]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:51 compute-0 sudo[60890]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:52 compute-0 sudo[61042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psawkzjatxucwdhhualmevpalzkxnkzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804632.0764823-71-226666180665239/AnsiballZ_command.py'
Nov 22 09:43:52 compute-0 sudo[61042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:52 compute-0 python3.9[61044]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:43:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:43:52 compute-0 sudo[61042]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:53 compute-0 sudo[61204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bovbcmjcbzcntyrqjcvzrywzdxcnmbie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804633.0005882-79-157171107232854/AnsiballZ_stat.py'
Nov 22 09:43:53 compute-0 sudo[61204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:53 compute-0 python3.9[61206]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:43:53 compute-0 sudo[61204]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:53 compute-0 sudo[61282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kefppdeiguzmdsbyqyuyhssstvlvqfrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804633.0005882-79-157171107232854/AnsiballZ_file.py'
Nov 22 09:43:53 compute-0 sudo[61282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:54 compute-0 python3.9[61284]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:43:54 compute-0 sudo[61282]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:54 compute-0 sudo[61434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwmwdmytoxiyflkstdqlbvzmlhqcloac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804634.3206267-91-106229229921277/AnsiballZ_stat.py'
Nov 22 09:43:54 compute-0 sudo[61434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:54 compute-0 python3.9[61436]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:43:54 compute-0 sudo[61434]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:55 compute-0 sudo[61512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldxlsojbhtempcbpxaenbdvaimcgavyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804634.3206267-91-106229229921277/AnsiballZ_file.py'
Nov 22 09:43:55 compute-0 sudo[61512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:55 compute-0 python3.9[61514]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:43:55 compute-0 sudo[61512]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:55 compute-0 sudo[61664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbpbessfkeycthllfvbjnbhbidnxnokj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804635.446563-104-78911427331055/AnsiballZ_ini_file.py'
Nov 22 09:43:55 compute-0 sudo[61664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:56 compute-0 python3.9[61666]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:43:56 compute-0 sudo[61664]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:56 compute-0 sudo[61816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvgyyqtnedfkkpbfojvsplbqncnydwxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804636.3960612-104-65965750905571/AnsiballZ_ini_file.py'
Nov 22 09:43:56 compute-0 sudo[61816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:56 compute-0 python3.9[61818]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:43:56 compute-0 sudo[61816]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:57 compute-0 sudo[61968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kawqemhdpkhbnuwtmsmeyefollwruqrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804637.0968924-104-174062005435254/AnsiballZ_ini_file.py'
Nov 22 09:43:57 compute-0 sudo[61968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:57 compute-0 python3.9[61970]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:43:57 compute-0 sudo[61968]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:58 compute-0 sudo[62120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acerpnhziwmaggxlbanebkbgbfohbytp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804637.8047469-104-53990855372304/AnsiballZ_ini_file.py'
Nov 22 09:43:58 compute-0 sudo[62120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:58 compute-0 python3.9[62122]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:43:58 compute-0 sudo[62120]: pam_unix(sudo:session): session closed for user root
Nov 22 09:43:58 compute-0 sudo[62272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tybjcrlmvfckzewdazwhlcvwzeoafxqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804638.590556-135-171698337481135/AnsiballZ_dnf.py'
Nov 22 09:43:58 compute-0 sudo[62272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:43:59 compute-0 python3.9[62274]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:44:00 compute-0 sudo[62272]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:01 compute-0 sudo[62425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgcmcnsirfwmqibfebcxsbowqlskvqpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804640.962098-146-5926598697816/AnsiballZ_setup.py'
Nov 22 09:44:01 compute-0 sudo[62425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:01 compute-0 python3.9[62427]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:44:01 compute-0 sudo[62425]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:02 compute-0 sudo[62579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcapgnvpvaydfwogprqbvydhwvoudqjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804641.7942772-154-237266434834158/AnsiballZ_stat.py'
Nov 22 09:44:02 compute-0 sudo[62579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:02 compute-0 python3.9[62581]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:44:02 compute-0 sudo[62579]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:02 compute-0 sudo[62731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlrhxyqxempbbacucuiqdjnubeitxzpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804642.6063523-163-249086050962845/AnsiballZ_stat.py'
Nov 22 09:44:02 compute-0 sudo[62731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:03 compute-0 python3.9[62733]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:44:03 compute-0 sudo[62731]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:03 compute-0 sudo[62883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggfikcaajfjoidiwohqycpufaggpojyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804643.4846313-173-66160712712930/AnsiballZ_command.py'
Nov 22 09:44:03 compute-0 sudo[62883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:04 compute-0 python3.9[62885]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:44:04 compute-0 sudo[62883]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:04 compute-0 sudo[63036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvcetudlvqxlbfexnotqqyphaqwhnlzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804644.3403363-183-260848042363039/AnsiballZ_service_facts.py'
Nov 22 09:44:04 compute-0 sudo[63036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:05 compute-0 python3.9[63038]: ansible-service_facts Invoked
Nov 22 09:44:05 compute-0 network[63055]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 09:44:05 compute-0 network[63056]: 'network-scripts' will be removed from distribution in near future.
Nov 22 09:44:05 compute-0 network[63057]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 09:44:09 compute-0 sudo[63036]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:10 compute-0 sudo[63340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tekiqlartrcgkyrytdfumnadfdmmlqht ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1763804650.0044641-198-68114389614618/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1763804650.0044641-198-68114389614618/args'
Nov 22 09:44:10 compute-0 sudo[63340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:10 compute-0 sudo[63340]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:11 compute-0 sudo[63507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jccxyzutkdivfvyxsmllsklotjqambla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804650.87267-209-191289754288131/AnsiballZ_dnf.py'
Nov 22 09:44:11 compute-0 sudo[63507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:11 compute-0 python3.9[63509]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:44:12 compute-0 sudo[63507]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:13 compute-0 sudo[63661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tadcfnpwlxgjodqnstlzuwzghwcmlzxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804653.1085384-222-139737750889840/AnsiballZ_package_facts.py'
Nov 22 09:44:13 compute-0 sudo[63661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:14 compute-0 python3.9[63663]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 22 09:44:14 compute-0 sudo[63661]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:15 compute-0 sudo[63813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikxfttwpfkozrlltbcnsecowmjjeiqfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804654.9303417-232-112272497786946/AnsiballZ_stat.py'
Nov 22 09:44:15 compute-0 sudo[63813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:15 compute-0 python3.9[63815]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:44:15 compute-0 sudo[63813]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:16 compute-0 sudo[63938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgsdqjkzszqsrbagjqtivohwtmirzqdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804654.9303417-232-112272497786946/AnsiballZ_copy.py'
Nov 22 09:44:16 compute-0 sudo[63938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:16 compute-0 python3.9[63940]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804654.9303417-232-112272497786946/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:44:16 compute-0 sudo[63938]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:17 compute-0 sudo[64092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuvhvxjfxzgqztbiqbvdmquqvmauwkbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804656.707968-247-231110973035142/AnsiballZ_stat.py'
Nov 22 09:44:17 compute-0 sudo[64092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:17 compute-0 python3.9[64094]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:44:17 compute-0 sudo[64092]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:17 compute-0 sudo[64217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxsdfeixlnwwxllzflddbeshapeqepcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804656.707968-247-231110973035142/AnsiballZ_copy.py'
Nov 22 09:44:17 compute-0 sudo[64217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:17 compute-0 python3.9[64219]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804656.707968-247-231110973035142/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:44:17 compute-0 sudo[64217]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:18 compute-0 sudo[64371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdaxvlwmjihkzlfklqpgxnjklrikqeph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804658.5078576-268-147510798204548/AnsiballZ_lineinfile.py'
Nov 22 09:44:18 compute-0 sudo[64371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:19 compute-0 python3.9[64373]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:44:19 compute-0 sudo[64371]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:20 compute-0 sudo[64525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcgjurzluqmlngclhqmwdminbkmgdwwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804659.9148977-283-202437015363035/AnsiballZ_setup.py'
Nov 22 09:44:20 compute-0 sudo[64525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:20 compute-0 python3.9[64527]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:44:20 compute-0 sudo[64525]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:21 compute-0 sudo[64609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qksokpuqgnuhmtvdmavmelqsetmxzcvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804659.9148977-283-202437015363035/AnsiballZ_systemd.py'
Nov 22 09:44:21 compute-0 sudo[64609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:21 compute-0 python3.9[64611]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:44:21 compute-0 sudo[64609]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:22 compute-0 sudo[64763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jadqlxbsqsaawgutfnzgdvwwwsyxmumq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804662.3480816-299-264871063036622/AnsiballZ_setup.py'
Nov 22 09:44:22 compute-0 sudo[64763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:23 compute-0 python3.9[64765]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:44:23 compute-0 sudo[64763]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:23 compute-0 sudo[64847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekxcusltwlmtjhdzzeeoavklfgqsukhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804662.3480816-299-264871063036622/AnsiballZ_systemd.py'
Nov 22 09:44:23 compute-0 sudo[64847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:23 compute-0 python3.9[64849]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:44:23 compute-0 chronyd[827]: chronyd exiting
Nov 22 09:44:23 compute-0 systemd[1]: Stopping NTP client/server...
Nov 22 09:44:23 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Nov 22 09:44:23 compute-0 systemd[1]: Stopped NTP client/server.
Nov 22 09:44:23 compute-0 systemd[1]: Starting NTP client/server...
Nov 22 09:44:23 compute-0 chronyd[64858]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 22 09:44:23 compute-0 chronyd[64858]: Frequency -26.975 +/- 0.155 ppm read from /var/lib/chrony/drift
Nov 22 09:44:23 compute-0 chronyd[64858]: Loaded seccomp filter (level 2)
Nov 22 09:44:23 compute-0 systemd[1]: Started NTP client/server.
Nov 22 09:44:24 compute-0 sudo[64847]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:24 compute-0 sshd-session[60002]: Connection closed by 192.168.122.30 port 44274
Nov 22 09:44:24 compute-0 sshd-session[59999]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:44:24 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Nov 22 09:44:24 compute-0 systemd[1]: session-13.scope: Consumed 27.042s CPU time.
Nov 22 09:44:24 compute-0 systemd-logind[819]: Session 13 logged out. Waiting for processes to exit.
Nov 22 09:44:24 compute-0 systemd-logind[819]: Removed session 13.
Nov 22 09:44:30 compute-0 sshd-session[64885]: Accepted publickey for zuul from 192.168.122.30 port 55398 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:44:30 compute-0 systemd-logind[819]: New session 14 of user zuul.
Nov 22 09:44:30 compute-0 systemd[1]: Started Session 14 of User zuul.
Nov 22 09:44:30 compute-0 sshd-session[64885]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:44:31 compute-0 python3.9[65038]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:44:32 compute-0 sudo[65192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkgmnmkecymwatjsfbbgbhsygalopfnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804671.9496405-33-177393565302061/AnsiballZ_file.py'
Nov 22 09:44:32 compute-0 sudo[65192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:32 compute-0 python3.9[65194]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:44:32 compute-0 sudo[65192]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:33 compute-0 sudo[65367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goexqcqxlfwzbpqfnjzcmupgenkmsqqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804672.8693843-41-214295288381419/AnsiballZ_stat.py'
Nov 22 09:44:33 compute-0 sudo[65367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:33 compute-0 python3.9[65369]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:44:33 compute-0 sudo[65367]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:33 compute-0 sudo[65445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnwqclvodpwbxropcgprbjhsjhpxdghu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804672.8693843-41-214295288381419/AnsiballZ_file.py'
Nov 22 09:44:33 compute-0 sudo[65445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:34 compute-0 python3.9[65447]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.sidq_ld4 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:44:34 compute-0 sudo[65445]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:34 compute-0 sudo[65597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztetogudxtjlermglcrpwpkjwjyjvvym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804674.465216-61-121915512131313/AnsiballZ_stat.py'
Nov 22 09:44:34 compute-0 sudo[65597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:34 compute-0 python3.9[65599]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:44:34 compute-0 sudo[65597]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:35 compute-0 sudo[65720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrzqrvyabjgwfuihvfkegnnowfaoesyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804674.465216-61-121915512131313/AnsiballZ_copy.py'
Nov 22 09:44:35 compute-0 sudo[65720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:35 compute-0 python3.9[65722]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804674.465216-61-121915512131313/.source _original_basename=.au2eu7yw follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:44:35 compute-0 sudo[65720]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:36 compute-0 sudo[65872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyygzlnrlxeiggodtmlbfndctlwtuyeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804675.8250182-77-18701860970283/AnsiballZ_file.py'
Nov 22 09:44:36 compute-0 sudo[65872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:36 compute-0 python3.9[65874]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:44:36 compute-0 sudo[65872]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:36 compute-0 sudo[66024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phmiozmbrkshfauvopckimevgwrkpisj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804676.5572436-85-265932523834821/AnsiballZ_stat.py'
Nov 22 09:44:36 compute-0 sudo[66024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:37 compute-0 python3.9[66026]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:44:37 compute-0 sudo[66024]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:37 compute-0 sudo[66147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffsoplivemauplhwtygggrjibmqfiapl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804676.5572436-85-265932523834821/AnsiballZ_copy.py'
Nov 22 09:44:37 compute-0 sudo[66147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:37 compute-0 python3.9[66149]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804676.5572436-85-265932523834821/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:44:37 compute-0 sudo[66147]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:38 compute-0 sudo[66299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiwxabmtsoptawyddgiyvhzsjfqjppzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804677.816727-85-125651032344213/AnsiballZ_stat.py'
Nov 22 09:44:38 compute-0 sudo[66299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:38 compute-0 python3.9[66301]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:44:38 compute-0 sudo[66299]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:38 compute-0 sudo[66422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jomaxzsbppehhzzcvpsyarcrylzrhand ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804677.816727-85-125651032344213/AnsiballZ_copy.py'
Nov 22 09:44:38 compute-0 sudo[66422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:38 compute-0 python3.9[66424]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804677.816727-85-125651032344213/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:44:38 compute-0 sudo[66422]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:39 compute-0 sudo[66574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwlvorswgzljggalslopzukzqxzibwzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804679.1705225-114-34833953638299/AnsiballZ_file.py'
Nov 22 09:44:39 compute-0 sudo[66574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:39 compute-0 python3.9[66576]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:44:39 compute-0 sudo[66574]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:40 compute-0 sudo[66726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aagloxepmrfixfyeiqvtbmthghdxptrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804679.978829-122-173083399747097/AnsiballZ_stat.py'
Nov 22 09:44:40 compute-0 sudo[66726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:40 compute-0 python3.9[66728]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:44:40 compute-0 sudo[66726]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:40 compute-0 sudo[66849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecipywbqniaihnxarjqomwmwlgbtetdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804679.978829-122-173083399747097/AnsiballZ_copy.py'
Nov 22 09:44:40 compute-0 sudo[66849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:41 compute-0 python3.9[66851]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804679.978829-122-173083399747097/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:44:41 compute-0 sudo[66849]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:41 compute-0 sudo[67001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycqztgqajufmvdbpsjxnjjimisgxmpin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804681.3775227-137-193373900593785/AnsiballZ_stat.py'
Nov 22 09:44:41 compute-0 sudo[67001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:41 compute-0 python3.9[67003]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:44:41 compute-0 sudo[67001]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:42 compute-0 sudo[67124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxbbmnjudxkuykdapbmkxosmbfuygozs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804681.3775227-137-193373900593785/AnsiballZ_copy.py'
Nov 22 09:44:42 compute-0 sudo[67124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:42 compute-0 python3.9[67126]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804681.3775227-137-193373900593785/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:44:42 compute-0 sudo[67124]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:43 compute-0 sudo[67276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiorvofabwocvtkebpeebnesgnashuvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804682.758591-152-266109861767775/AnsiballZ_systemd.py'
Nov 22 09:44:43 compute-0 sudo[67276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:43 compute-0 python3.9[67278]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:44:43 compute-0 systemd[1]: Reloading.
Nov 22 09:44:43 compute-0 systemd-rc-local-generator[67307]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:44:43 compute-0 systemd-sysv-generator[67310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:44:44 compute-0 systemd[1]: Reloading.
Nov 22 09:44:44 compute-0 systemd-rc-local-generator[67342]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:44:44 compute-0 systemd-sysv-generator[67346]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:44:44 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Nov 22 09:44:44 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Nov 22 09:44:44 compute-0 sudo[67276]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:44 compute-0 sudo[67504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luaiksdogrnhzejyxbbuoahcnrkwavsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804684.5310075-160-43119872224626/AnsiballZ_stat.py'
Nov 22 09:44:44 compute-0 sudo[67504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:45 compute-0 python3.9[67506]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:44:45 compute-0 sudo[67504]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:45 compute-0 sudo[67627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwejhhizxprgwwmppyofgbzzssqctbci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804684.5310075-160-43119872224626/AnsiballZ_copy.py'
Nov 22 09:44:45 compute-0 sudo[67627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:45 compute-0 python3.9[67629]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804684.5310075-160-43119872224626/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:44:45 compute-0 sudo[67627]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:46 compute-0 sudo[67779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwrqwinwxwgqojjbqvjhpumhejlhkkqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804685.9366586-175-10931030151014/AnsiballZ_stat.py'
Nov 22 09:44:46 compute-0 sudo[67779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:46 compute-0 python3.9[67781]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:44:46 compute-0 sudo[67779]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:46 compute-0 sudo[67902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlubfzzpthazldvwxndepdptzqrtiven ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804685.9366586-175-10931030151014/AnsiballZ_copy.py'
Nov 22 09:44:46 compute-0 sudo[67902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:47 compute-0 python3.9[67904]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804685.9366586-175-10931030151014/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:44:47 compute-0 sudo[67902]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:47 compute-0 sudo[68054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxbjszzwlfnscicgedmiemkrkggfeaum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804687.3981483-190-115090441532984/AnsiballZ_systemd.py'
Nov 22 09:44:47 compute-0 sudo[68054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:48 compute-0 python3.9[68056]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:44:48 compute-0 systemd[1]: Reloading.
Nov 22 09:44:48 compute-0 systemd-sysv-generator[68082]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:44:48 compute-0 systemd-rc-local-generator[68078]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:44:48 compute-0 systemd[1]: Reloading.
Nov 22 09:44:48 compute-0 systemd-rc-local-generator[68125]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:44:48 compute-0 systemd-sysv-generator[68129]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:44:48 compute-0 systemd[1]: Starting Create netns directory...
Nov 22 09:44:48 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 09:44:48 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 09:44:48 compute-0 systemd[1]: Finished Create netns directory.
Nov 22 09:44:48 compute-0 sudo[68054]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:49 compute-0 python3.9[68286]: ansible-ansible.builtin.service_facts Invoked
Nov 22 09:44:49 compute-0 network[68303]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 09:44:49 compute-0 network[68304]: 'network-scripts' will be removed from distribution in near future.
Nov 22 09:44:49 compute-0 network[68305]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 09:44:54 compute-0 sudo[68565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iktnqtgryqprrhizcwubvhuavqzzzbcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804694.1876602-206-213203228882931/AnsiballZ_systemd.py'
Nov 22 09:44:54 compute-0 sudo[68565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:54 compute-0 python3.9[68567]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:44:54 compute-0 systemd[1]: Reloading.
Nov 22 09:44:55 compute-0 systemd-rc-local-generator[68598]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:44:55 compute-0 systemd-sysv-generator[68601]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:44:55 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 22 09:44:55 compute-0 iptables.init[68607]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 22 09:44:55 compute-0 iptables.init[68607]: iptables: Flushing firewall rules: [  OK  ]
Nov 22 09:44:55 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Nov 22 09:44:55 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 22 09:44:55 compute-0 sudo[68565]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:56 compute-0 sudo[68801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojcfamzsvfuapcolqvbsecqrmapkhnzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804695.7689474-206-118176733853588/AnsiballZ_systemd.py'
Nov 22 09:44:56 compute-0 sudo[68801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:56 compute-0 python3.9[68803]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:44:56 compute-0 sudo[68801]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:57 compute-0 sudo[68955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-surtxzkogggwuobhqthazinfiweybomx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804696.7794328-222-45798311555035/AnsiballZ_systemd.py'
Nov 22 09:44:57 compute-0 sudo[68955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:57 compute-0 python3.9[68957]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:44:57 compute-0 systemd[1]: Reloading.
Nov 22 09:44:57 compute-0 systemd-sysv-generator[68988]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:44:57 compute-0 systemd-rc-local-generator[68983]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:44:57 compute-0 systemd[1]: Starting Netfilter Tables...
Nov 22 09:44:57 compute-0 systemd[1]: Finished Netfilter Tables.
Nov 22 09:44:57 compute-0 sudo[68955]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:58 compute-0 sudo[69147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdazzhejtrlzxswiklawvtixkgbjvvef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804697.942901-230-169580532333730/AnsiballZ_command.py'
Nov 22 09:44:58 compute-0 sudo[69147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:58 compute-0 python3.9[69149]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:44:58 compute-0 sudo[69147]: pam_unix(sudo:session): session closed for user root
Nov 22 09:44:59 compute-0 sudo[69300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prmxvzxggsvvrpjrkojbetwfwktbuxsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804699.2348394-244-245589514421511/AnsiballZ_stat.py'
Nov 22 09:44:59 compute-0 sudo[69300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:44:59 compute-0 python3.9[69302]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:44:59 compute-0 sudo[69300]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:00 compute-0 sudo[69425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxytjkocyffbvaenwvbbyzfzbzdekzqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804699.2348394-244-245589514421511/AnsiballZ_copy.py'
Nov 22 09:45:00 compute-0 sudo[69425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:00 compute-0 python3.9[69427]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804699.2348394-244-245589514421511/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:00 compute-0 sudo[69425]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:01 compute-0 sudo[69578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nijzntfwprwvuzspmlbvvfrdsczispcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804700.8238928-259-762594912131/AnsiballZ_systemd.py'
Nov 22 09:45:01 compute-0 sudo[69578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:01 compute-0 python3.9[69580]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:45:01 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Nov 22 09:45:01 compute-0 sshd[1005]: Received SIGHUP; restarting.
Nov 22 09:45:01 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Nov 22 09:45:01 compute-0 sshd[1005]: Server listening on 0.0.0.0 port 22.
Nov 22 09:45:01 compute-0 sshd[1005]: Server listening on :: port 22.
Nov 22 09:45:01 compute-0 sudo[69578]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:01 compute-0 sudo[69734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyaipopvohqmeqdlxfckgevjsyugmigo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804701.6389866-267-44234540597476/AnsiballZ_file.py'
Nov 22 09:45:01 compute-0 sudo[69734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:02 compute-0 python3.9[69736]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:02 compute-0 sudo[69734]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:02 compute-0 sudo[69886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfvghqnrvahdhmvtgqjrgklkmadxgfrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804702.28404-275-177946404397059/AnsiballZ_stat.py'
Nov 22 09:45:02 compute-0 sudo[69886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:02 compute-0 python3.9[69888]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:45:02 compute-0 sudo[69886]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:03 compute-0 sudo[70009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txyroxwhkvrtsfesbujxqbhwmthsukcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804702.28404-275-177946404397059/AnsiballZ_copy.py'
Nov 22 09:45:03 compute-0 sudo[70009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:03 compute-0 python3.9[70011]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804702.28404-275-177946404397059/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:03 compute-0 sudo[70009]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:04 compute-0 sudo[70161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unrenrkttaplytneakiichdbhvndflka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804703.7764168-293-250278413685194/AnsiballZ_timezone.py'
Nov 22 09:45:04 compute-0 sudo[70161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:04 compute-0 python3.9[70163]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 22 09:45:04 compute-0 systemd[1]: Starting Time & Date Service...
Nov 22 09:45:04 compute-0 systemd[1]: Started Time & Date Service.
Nov 22 09:45:04 compute-0 sudo[70161]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:05 compute-0 sudo[70317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyvhokunfvdiaqmavqcsdmagxwvnfhbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804704.761415-302-98496200451087/AnsiballZ_file.py'
Nov 22 09:45:05 compute-0 sudo[70317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:05 compute-0 python3.9[70319]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:05 compute-0 sudo[70317]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:05 compute-0 sudo[70469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdqppuwawdcjtfwlqxzhyhnkgwosasqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804705.4996552-310-107322685526745/AnsiballZ_stat.py'
Nov 22 09:45:05 compute-0 sudo[70469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:06 compute-0 python3.9[70471]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:45:06 compute-0 sudo[70469]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:06 compute-0 sudo[70592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukdqmjqvdkauqbfreqxpgdtodzkbcskk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804705.4996552-310-107322685526745/AnsiballZ_copy.py'
Nov 22 09:45:06 compute-0 sudo[70592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:06 compute-0 python3.9[70594]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804705.4996552-310-107322685526745/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:06 compute-0 sudo[70592]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:07 compute-0 sudo[70744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezcfyejeduuzmarejeuoqxrrwlvwkols ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804707.030257-325-159206392029325/AnsiballZ_stat.py'
Nov 22 09:45:07 compute-0 sudo[70744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:07 compute-0 python3.9[70746]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:45:07 compute-0 sudo[70744]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:08 compute-0 sudo[70867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwiwnbcoiwktupqqklqpbxasauxraxir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804707.030257-325-159206392029325/AnsiballZ_copy.py'
Nov 22 09:45:08 compute-0 sudo[70867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:08 compute-0 python3.9[70869]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804707.030257-325-159206392029325/.source.yaml _original_basename=.pkc1ysfl follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:08 compute-0 sudo[70867]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:09 compute-0 sudo[71019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugkammcccnmkeeysuqkiqyqnmzwpjrnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804708.6678517-340-54578540501175/AnsiballZ_stat.py'
Nov 22 09:45:09 compute-0 sudo[71019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:09 compute-0 python3.9[71021]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:45:09 compute-0 sudo[71019]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:09 compute-0 sudo[71142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypaxdcnyixpuuucbwofguqhttsaiyrjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804708.6678517-340-54578540501175/AnsiballZ_copy.py'
Nov 22 09:45:09 compute-0 sudo[71142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:10 compute-0 python3.9[71144]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804708.6678517-340-54578540501175/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:10 compute-0 sudo[71142]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:10 compute-0 sudo[71294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlynzdgkvhkpeztwcguehhjyvmgwazxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804710.2810166-355-176173669632772/AnsiballZ_command.py'
Nov 22 09:45:10 compute-0 sudo[71294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:10 compute-0 python3.9[71296]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:45:11 compute-0 sudo[71294]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:11 compute-0 sudo[71447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twdvbxwwndbxnmogtvmmpurwjybsbpqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804711.2493443-363-207015320618504/AnsiballZ_command.py'
Nov 22 09:45:11 compute-0 sudo[71447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:11 compute-0 python3.9[71449]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:45:11 compute-0 sudo[71447]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:12 compute-0 sudo[71600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiaixaftnciodqbrdljjnalcaoeuypnm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763804712.0609858-371-103136303916219/AnsiballZ_edpm_nftables_from_files.py'
Nov 22 09:45:12 compute-0 sudo[71600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:12 compute-0 python3[71602]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 09:45:12 compute-0 sudo[71600]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:13 compute-0 sudo[71752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewpgjbsytdujzulgqepyhakiaszrjrre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804713.032715-379-95528914755416/AnsiballZ_stat.py'
Nov 22 09:45:13 compute-0 sudo[71752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:13 compute-0 python3.9[71754]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:45:13 compute-0 sudo[71752]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:14 compute-0 sudo[71875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvjnxezafiqzbpkyjudnijgpkzzkclye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804713.032715-379-95528914755416/AnsiballZ_copy.py'
Nov 22 09:45:14 compute-0 sudo[71875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:14 compute-0 python3.9[71877]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804713.032715-379-95528914755416/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:14 compute-0 sudo[71875]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:14 compute-0 sudo[72027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igoftdfqtnibmhvbewdfidwcyvlrhsin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804714.5101724-394-40170619294032/AnsiballZ_stat.py'
Nov 22 09:45:14 compute-0 sudo[72027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:15 compute-0 python3.9[72029]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:45:15 compute-0 sudo[72027]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:15 compute-0 sudo[72150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwodysdyjonhhvvodpwszgewhdiylozu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804714.5101724-394-40170619294032/AnsiballZ_copy.py'
Nov 22 09:45:15 compute-0 sudo[72150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:15 compute-0 python3.9[72152]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804714.5101724-394-40170619294032/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:15 compute-0 sudo[72150]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:16 compute-0 sudo[72302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evpahvcfidkyehovfsegraajyjidszko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804715.9836283-409-218807361113248/AnsiballZ_stat.py'
Nov 22 09:45:16 compute-0 sudo[72302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:16 compute-0 python3.9[72304]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:45:16 compute-0 sudo[72302]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:17 compute-0 sudo[72425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtqibmvvwuwraceafblcohnmulxkgdvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804715.9836283-409-218807361113248/AnsiballZ_copy.py'
Nov 22 09:45:17 compute-0 sudo[72425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:17 compute-0 python3.9[72427]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804715.9836283-409-218807361113248/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:17 compute-0 sudo[72425]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:17 compute-0 sudo[72577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzzohupecmgxejddvyrrzsinenikqpwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804717.3836267-424-135249968705559/AnsiballZ_stat.py'
Nov 22 09:45:17 compute-0 sudo[72577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:17 compute-0 python3.9[72579]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:45:17 compute-0 sudo[72577]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:18 compute-0 sudo[72700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlsubjwzatfjicukiuvsmtzrjybdnhzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804717.3836267-424-135249968705559/AnsiballZ_copy.py'
Nov 22 09:45:18 compute-0 sudo[72700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:18 compute-0 python3.9[72702]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804717.3836267-424-135249968705559/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:18 compute-0 sudo[72700]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:19 compute-0 sudo[72852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdpnndwuulvgrbqwetrynexcupgdzzhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804718.7264786-439-117032939421243/AnsiballZ_stat.py'
Nov 22 09:45:19 compute-0 sudo[72852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:19 compute-0 python3.9[72854]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:45:19 compute-0 sudo[72852]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:19 compute-0 sudo[72975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrtnmeuuymyvidfvfsvonpifnmpmktpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804718.7264786-439-117032939421243/AnsiballZ_copy.py'
Nov 22 09:45:19 compute-0 sudo[72975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:19 compute-0 python3.9[72977]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804718.7264786-439-117032939421243/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:19 compute-0 sudo[72975]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:20 compute-0 sudo[73127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oanxfcgtvhjmzoqznptwywcxiqqjyemi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804720.1795058-454-168747909196703/AnsiballZ_file.py'
Nov 22 09:45:20 compute-0 sudo[73127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:20 compute-0 python3.9[73129]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:20 compute-0 sudo[73127]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:21 compute-0 sudo[73279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djqfbmtkecnegcyforcrqqvgcowxvzvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804720.898949-462-148000173246831/AnsiballZ_command.py'
Nov 22 09:45:21 compute-0 sudo[73279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:21 compute-0 python3.9[73281]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:45:21 compute-0 sudo[73279]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:22 compute-0 sudo[73438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhxriqpkeeuxhubsletmgeejuahyapnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804721.7544692-470-189932165224749/AnsiballZ_blockinfile.py'
Nov 22 09:45:22 compute-0 sudo[73438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:22 compute-0 python3.9[73440]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:22 compute-0 sudo[73438]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:23 compute-0 sudo[73591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcbwsxfjncxwzfpkptrdxkxywjvfxevo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804722.88584-479-32931172824296/AnsiballZ_file.py'
Nov 22 09:45:23 compute-0 sudo[73591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:23 compute-0 python3.9[73593]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:23 compute-0 sudo[73591]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:24 compute-0 sudo[73743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnqgbvvfenbgfbyomucomxdymbwfbiyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804723.7068818-479-12015862622574/AnsiballZ_file.py'
Nov 22 09:45:24 compute-0 sudo[73743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:24 compute-0 python3.9[73745]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:24 compute-0 sudo[73743]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:25 compute-0 sudo[73895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scragandagylbytdicorbumklgwzyqul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804724.4936728-494-33202489108354/AnsiballZ_mount.py'
Nov 22 09:45:25 compute-0 sudo[73895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:25 compute-0 python3.9[73897]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 22 09:45:25 compute-0 sudo[73895]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:25 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 09:45:25 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 09:45:25 compute-0 sudo[74049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjkaifncjeniuhwrltuoxbryoppvyawc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804725.516027-494-170631188195041/AnsiballZ_mount.py'
Nov 22 09:45:25 compute-0 sudo[74049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:26 compute-0 python3.9[74051]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 22 09:45:26 compute-0 sudo[74049]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:26 compute-0 sshd-session[64888]: Connection closed by 192.168.122.30 port 55398
Nov 22 09:45:26 compute-0 sshd-session[64885]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:45:26 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Nov 22 09:45:26 compute-0 systemd[1]: session-14.scope: Consumed 39.618s CPU time.
Nov 22 09:45:26 compute-0 systemd-logind[819]: Session 14 logged out. Waiting for processes to exit.
Nov 22 09:45:26 compute-0 systemd-logind[819]: Removed session 14.
Nov 22 09:45:32 compute-0 sshd-session[74077]: Accepted publickey for zuul from 192.168.122.30 port 36788 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:45:32 compute-0 systemd-logind[819]: New session 15 of user zuul.
Nov 22 09:45:32 compute-0 systemd[1]: Started Session 15 of User zuul.
Nov 22 09:45:32 compute-0 sshd-session[74077]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:45:33 compute-0 sudo[74230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouacstwonuenhypxkqswynddywuwxibm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804732.724284-16-218488949401850/AnsiballZ_tempfile.py'
Nov 22 09:45:33 compute-0 sudo[74230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:33 compute-0 python3.9[74232]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 22 09:45:33 compute-0 sudo[74230]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:34 compute-0 sudo[74382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdwwifmgajmsmqzabgkvafmpndgizfbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804733.6665046-28-82298856017412/AnsiballZ_stat.py'
Nov 22 09:45:34 compute-0 sudo[74382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:34 compute-0 python3.9[74384]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:45:34 compute-0 sudo[74382]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:34 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 09:45:35 compute-0 sudo[74536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilxznrnxqcqndzmicgmzqoqnapxjeuhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804734.5736017-38-217803353871555/AnsiballZ_setup.py'
Nov 22 09:45:35 compute-0 sudo[74536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:35 compute-0 python3.9[74538]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:45:35 compute-0 sudo[74536]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:36 compute-0 sudo[74688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uppmvmmupwiqkzxauynrqjyadwkmcqhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804735.7735417-47-257606098092647/AnsiballZ_blockinfile.py'
Nov 22 09:45:36 compute-0 sudo[74688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:36 compute-0 python3.9[74690]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCSfTRAOMMeAe0xY971vcsuqh0+DHfyLJ3u5BIB7nr7Rn/Sn0uuYXfNoJ+ZV50udN41GTxwZFPGfz1AdGRrXGYQzOmH7xDAJTFV19pX0l04KVHlDG4pR6nBCxNAzpKxvIA+38O3NNNUaeCTn+zotw0D5D2GRJWrGtaTyhJQNTctOlzQXQI7qdRCluTs6eAWMedpMjukKy14dJzQlYHU8GFa3MjSntO+tqGZQ/qsnCONQSzEeknHi+Dmdv92AP/islbMiy8gNeRMveA/jIAXesGzykK0KJde5tcNFg4iUQ6/RWYRY0jLekU9tVNndMimK9zQqU54grwFx5Qm/LpgdcMX6QuZdvMa2q+aYqBKgWpTZKNMT0HQaPQXiuSWs53oQCiZzvn2fnnBYMcoiW0Mdwixy/q9RD2xYPsH+8BT48PlmD4Ie6MFH805dffsM9FYtOWQ5RAlsjif/O70/vKp2sxMLHFBC4/ua7+Q3LZuYrsW/4dBp0foPpmhCMLiWI6xZCE=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFFM8wA1GNByN+tKdLR45WocKyCYaiaapPym0Y4/yHs4
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK7IJv/3tmlDJxzdM7BkArsSUGeejrmJrqjDx+8OOM4EabpSQviveJ57gdmaSDnwt23pylMXUh7plybg7uN+uxA=
                                             create=True mode=0644 path=/tmp/ansible.29999v_j state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:36 compute-0 sudo[74688]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:37 compute-0 sudo[74840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyefxinjbwamxdjvaolviwapjeyifynt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804736.7856944-55-240061129665586/AnsiballZ_command.py'
Nov 22 09:45:37 compute-0 sudo[74840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:37 compute-0 python3.9[74842]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.29999v_j' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:45:37 compute-0 sudo[74840]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:38 compute-0 sudo[74994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-limefqpwsrtxsdavxuxrszlkdukxgspf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804737.6788774-63-194811090102749/AnsiballZ_file.py'
Nov 22 09:45:38 compute-0 sudo[74994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:38 compute-0 python3.9[74996]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.29999v_j state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:38 compute-0 sudo[74994]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:38 compute-0 sshd-session[74080]: Connection closed by 192.168.122.30 port 36788
Nov 22 09:45:38 compute-0 sshd-session[74077]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:45:38 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Nov 22 09:45:38 compute-0 systemd[1]: session-15.scope: Consumed 3.692s CPU time.
Nov 22 09:45:38 compute-0 systemd-logind[819]: Session 15 logged out. Waiting for processes to exit.
Nov 22 09:45:38 compute-0 systemd-logind[819]: Removed session 15.
Nov 22 09:45:41 compute-0 sshd-session[75021]: Invalid user admin from 103.56.115.6 port 46782
Nov 22 09:45:41 compute-0 sshd-session[75021]: Connection closed by invalid user admin 103.56.115.6 port 46782 [preauth]
Nov 22 09:45:44 compute-0 sshd-session[75023]: Accepted publickey for zuul from 192.168.122.30 port 34346 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:45:44 compute-0 systemd-logind[819]: New session 16 of user zuul.
Nov 22 09:45:44 compute-0 systemd[1]: Started Session 16 of User zuul.
Nov 22 09:45:44 compute-0 sshd-session[75023]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:45:45 compute-0 python3.9[75176]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:45:46 compute-0 sudo[75330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozgkxrasnmdnzurmsjthglndveoucnzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804746.0776913-32-85618306748739/AnsiballZ_systemd.py'
Nov 22 09:45:46 compute-0 sudo[75330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:46 compute-0 python3.9[75332]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 22 09:45:47 compute-0 sudo[75330]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:47 compute-0 sudo[75484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clhivvfzmhglhisxjlznebrhpklejswh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804747.2497926-40-169479585814834/AnsiballZ_systemd.py'
Nov 22 09:45:47 compute-0 sudo[75484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:47 compute-0 python3.9[75486]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:45:47 compute-0 sudo[75484]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:48 compute-0 sudo[75637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldpclyywplhmjggyzmfptkwzvgdigiuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804748.202754-49-260885868898487/AnsiballZ_command.py'
Nov 22 09:45:48 compute-0 sudo[75637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:48 compute-0 python3.9[75639]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:45:48 compute-0 sudo[75637]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:49 compute-0 sudo[75790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gglontpcsmcaznvcgkbaaprlyrubcvrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804749.0643725-57-156177748685359/AnsiballZ_stat.py'
Nov 22 09:45:49 compute-0 sudo[75790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:49 compute-0 python3.9[75792]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:45:49 compute-0 sudo[75790]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:50 compute-0 sudo[75944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swketkkdayeenaonoaqnofrbxwojtvia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804749.904127-65-179926720892633/AnsiballZ_command.py'
Nov 22 09:45:50 compute-0 sudo[75944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:50 compute-0 python3.9[75946]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:45:50 compute-0 sudo[75944]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:51 compute-0 sudo[76099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftpdnhogikmvfrfmtpazqhjvxvisaarf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804750.660169-73-275424985741920/AnsiballZ_file.py'
Nov 22 09:45:51 compute-0 sudo[76099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:51 compute-0 python3.9[76101]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:45:51 compute-0 sudo[76099]: pam_unix(sudo:session): session closed for user root
Nov 22 09:45:51 compute-0 sshd-session[75026]: Connection closed by 192.168.122.30 port 34346
Nov 22 09:45:51 compute-0 sshd-session[75023]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:45:51 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Nov 22 09:45:51 compute-0 systemd[1]: session-16.scope: Consumed 4.621s CPU time.
Nov 22 09:45:51 compute-0 systemd-logind[819]: Session 16 logged out. Waiting for processes to exit.
Nov 22 09:45:51 compute-0 systemd-logind[819]: Removed session 16.
Nov 22 09:45:57 compute-0 sshd-session[76126]: Accepted publickey for zuul from 192.168.122.30 port 43950 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:45:57 compute-0 systemd-logind[819]: New session 17 of user zuul.
Nov 22 09:45:57 compute-0 systemd[1]: Started Session 17 of User zuul.
Nov 22 09:45:57 compute-0 sshd-session[76126]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:45:58 compute-0 python3.9[76279]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:45:59 compute-0 sudo[76433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jciahmvdhgcanxyseqameidgdgohhcmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804759.015241-34-139231490617515/AnsiballZ_setup.py'
Nov 22 09:45:59 compute-0 sudo[76433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:45:59 compute-0 python3.9[76435]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:45:59 compute-0 sudo[76433]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:00 compute-0 sudo[76517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdsavvrfnhvgzdrghylevszskrgwwciq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804759.015241-34-139231490617515/AnsiballZ_dnf.py'
Nov 22 09:46:00 compute-0 sudo[76517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:00 compute-0 python3.9[76519]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 09:46:02 compute-0 sudo[76517]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:03 compute-0 python3.9[76670]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:46:04 compute-0 python3.9[76821]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 09:46:05 compute-0 python3.9[76971]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:46:06 compute-0 python3.9[77121]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:46:06 compute-0 sshd-session[76129]: Connection closed by 192.168.122.30 port 43950
Nov 22 09:46:06 compute-0 sshd-session[76126]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:46:06 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Nov 22 09:46:06 compute-0 systemd[1]: session-17.scope: Consumed 6.545s CPU time.
Nov 22 09:46:06 compute-0 systemd-logind[819]: Session 17 logged out. Waiting for processes to exit.
Nov 22 09:46:06 compute-0 systemd-logind[819]: Removed session 17.
Nov 22 09:46:17 compute-0 sshd-session[77146]: Accepted publickey for zuul from 192.168.122.30 port 44546 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:46:17 compute-0 systemd-logind[819]: New session 18 of user zuul.
Nov 22 09:46:17 compute-0 systemd[1]: Started Session 18 of User zuul.
Nov 22 09:46:17 compute-0 sshd-session[77146]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:46:18 compute-0 python3.9[77299]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:46:20 compute-0 sudo[77453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxdlklsqhewnjbiujdflssrssdsfgogw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804779.96496-50-227860163973341/AnsiballZ_file.py'
Nov 22 09:46:20 compute-0 sudo[77453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:20 compute-0 python3.9[77455]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:20 compute-0 sudo[77453]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:21 compute-0 sudo[77605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzqhepfdjoknfazvtdjwbgqxwxrfguer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804780.800086-50-192320870610014/AnsiballZ_file.py'
Nov 22 09:46:21 compute-0 sudo[77605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:21 compute-0 python3.9[77607]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:21 compute-0 sudo[77605]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:21 compute-0 sudo[77757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzzbbwcmizwvuwysfjoifhyymcdqqgel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804781.5228472-65-64313290400879/AnsiballZ_stat.py'
Nov 22 09:46:21 compute-0 sudo[77757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:22 compute-0 python3.9[77759]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:22 compute-0 sudo[77757]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:22 compute-0 sudo[77880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqawgjpdyxafidnvsmegumfgdzufswpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804781.5228472-65-64313290400879/AnsiballZ_copy.py'
Nov 22 09:46:22 compute-0 sudo[77880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:23 compute-0 python3.9[77882]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804781.5228472-65-64313290400879/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=2fe18c6115404d3257cc393c0fb4c9c6ec14dcf6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:23 compute-0 sudo[77880]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:23 compute-0 sudo[78032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynnnvvzmxdrxidgzexllyyydoitczkve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804783.3010848-65-174639019539761/AnsiballZ_stat.py'
Nov 22 09:46:23 compute-0 sudo[78032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:23 compute-0 python3.9[78034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:23 compute-0 sudo[78032]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:24 compute-0 sudo[78155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfzbggfwifjzbjpfdlkagkutztgvaxcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804783.3010848-65-174639019539761/AnsiballZ_copy.py'
Nov 22 09:46:24 compute-0 sudo[78155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:24 compute-0 python3.9[78157]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804783.3010848-65-174639019539761/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=b8a129474268143a485788cdffa6b04690f0835a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:24 compute-0 sudo[78155]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:25 compute-0 sudo[78307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfrrkexmeurfpjmojwjgayxhcwkjgltg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804784.9963791-65-178745150581864/AnsiballZ_stat.py'
Nov 22 09:46:25 compute-0 sudo[78307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:25 compute-0 python3.9[78309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:25 compute-0 sudo[78307]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:26 compute-0 sudo[78430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsmfxhihrxcammyjrabiyohloqlsnsko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804784.9963791-65-178745150581864/AnsiballZ_copy.py'
Nov 22 09:46:26 compute-0 sudo[78430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:26 compute-0 python3.9[78432]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804784.9963791-65-178745150581864/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=72da8fa55a8a0abfb61a7addcc2a09779219f9a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:26 compute-0 sudo[78430]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:26 compute-0 sudo[78582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mffjthuomacjxcvviagxitdjjenqrtgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804786.5191858-109-215965822332346/AnsiballZ_file.py'
Nov 22 09:46:26 compute-0 sudo[78582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:27 compute-0 python3.9[78584]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:27 compute-0 sudo[78582]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:27 compute-0 sudo[78734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hofcecrirblhnzcmrpdvgwwiwbhuyiho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804787.2719817-109-162001376407177/AnsiballZ_file.py'
Nov 22 09:46:27 compute-0 sudo[78734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:27 compute-0 python3.9[78736]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:27 compute-0 sudo[78734]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:28 compute-0 sudo[78886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwshistnjumnxgsklikxjezvnfiatsft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804788.0316408-124-225277759561078/AnsiballZ_stat.py'
Nov 22 09:46:28 compute-0 sudo[78886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:28 compute-0 python3.9[78888]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:28 compute-0 sudo[78886]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:28 compute-0 sudo[79009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzftqysfyxnzmfesctduzdyjxdbwlrsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804788.0316408-124-225277759561078/AnsiballZ_copy.py'
Nov 22 09:46:28 compute-0 sudo[79009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:29 compute-0 python3.9[79011]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804788.0316408-124-225277759561078/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=72e952aa81388e7e95706e1a73ed6fcbc455dbdf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:29 compute-0 sudo[79009]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:29 compute-0 sudo[79161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqevmeswefijdxsdquawnqwwpiwrbmzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804789.2016418-124-264671352723576/AnsiballZ_stat.py'
Nov 22 09:46:29 compute-0 sudo[79161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:29 compute-0 python3.9[79163]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:29 compute-0 sudo[79161]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:30 compute-0 sudo[79284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peafgtbihqzdgbbqycferghnttxjbgpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804789.2016418-124-264671352723576/AnsiballZ_copy.py'
Nov 22 09:46:30 compute-0 sudo[79284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:30 compute-0 python3.9[79286]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804789.2016418-124-264671352723576/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=94a35ff86648e83c820c5a13af929cdd22ed4940 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:30 compute-0 sudo[79284]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:30 compute-0 sudo[79436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxynohyjycdwxaigpbaaggglvclfkhnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804790.4371972-124-38064185536718/AnsiballZ_stat.py'
Nov 22 09:46:30 compute-0 sudo[79436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:30 compute-0 python3.9[79438]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:30 compute-0 sudo[79436]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:31 compute-0 sudo[79559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdeojqvdcwylqopblbbcgsyffbsqncoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804790.4371972-124-38064185536718/AnsiballZ_copy.py'
Nov 22 09:46:31 compute-0 sudo[79559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:31 compute-0 python3.9[79561]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804790.4371972-124-38064185536718/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=a2dc2b92cc4a67c64b06d7d68a61cce11e6649e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:31 compute-0 sudo[79559]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:31 compute-0 sudo[79711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnbajvqzkhghzhpbvegighsvdcmsoull ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804791.6873696-168-204612061505981/AnsiballZ_file.py'
Nov 22 09:46:31 compute-0 sudo[79711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:32 compute-0 python3.9[79713]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:32 compute-0 sudo[79711]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:32 compute-0 sudo[79863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emqmuzxlpdwucvoboottqtplydfmhvhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804792.2941148-168-81023387411687/AnsiballZ_file.py'
Nov 22 09:46:32 compute-0 sudo[79863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:32 compute-0 python3.9[79865]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:32 compute-0 sudo[79863]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:33 compute-0 sudo[80015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsgwwiwusghhkyqahyragjnyhscknpkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804792.9444132-183-198137624710277/AnsiballZ_stat.py'
Nov 22 09:46:33 compute-0 sudo[80015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:33 compute-0 python3.9[80017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:33 compute-0 sudo[80015]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:33 compute-0 chronyd[64858]: Selected source 149.56.19.163 (pool.ntp.org)
Nov 22 09:46:33 compute-0 sudo[80138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlumfdspsispvoibyagjpvctkaizjcjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804792.9444132-183-198137624710277/AnsiballZ_copy.py'
Nov 22 09:46:33 compute-0 sudo[80138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:34 compute-0 python3.9[80140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804792.9444132-183-198137624710277/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=235db774a6ebe9edfc197372de4fb21ce7de5b5b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:34 compute-0 sudo[80138]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:34 compute-0 sudo[80290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvvqfawudymglehosvaeeipcfijvncvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804794.2162104-183-120524130397396/AnsiballZ_stat.py'
Nov 22 09:46:34 compute-0 sudo[80290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:34 compute-0 python3.9[80292]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:34 compute-0 sudo[80290]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:35 compute-0 sudo[80413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmrdntfcbkvsjbclldrfvgautndxvojd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804794.2162104-183-120524130397396/AnsiballZ_copy.py'
Nov 22 09:46:35 compute-0 sudo[80413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:35 compute-0 python3.9[80415]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804794.2162104-183-120524130397396/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=655cb7ec6765ae7da7b7aec3220a36fd244541a3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:35 compute-0 sudo[80413]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:35 compute-0 sudo[80565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnsmphtznrcfjdkwpixenkiszmltfbqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804795.3791616-183-119341482878661/AnsiballZ_stat.py'
Nov 22 09:46:35 compute-0 sudo[80565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:35 compute-0 python3.9[80567]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:35 compute-0 sudo[80565]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:36 compute-0 sudo[80688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezzrinwowsgsovdsccgbbfujpwohwidb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804795.3791616-183-119341482878661/AnsiballZ_copy.py'
Nov 22 09:46:36 compute-0 sudo[80688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:36 compute-0 python3.9[80690]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804795.3791616-183-119341482878661/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=567f66ec8b02fab7c8036c082f4f6aa6cbea1e7c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:36 compute-0 sudo[80688]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:36 compute-0 sudo[80840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phexzrcowfljftvsshvcprxphdraamsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804796.6936865-227-169767774724681/AnsiballZ_file.py'
Nov 22 09:46:36 compute-0 sudo[80840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:37 compute-0 python3.9[80842]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:37 compute-0 sudo[80840]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:37 compute-0 sudo[80992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsnirvwpwnenxdsvjauiviwrtphqyrui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804797.2801769-227-835407704141/AnsiballZ_file.py'
Nov 22 09:46:37 compute-0 sudo[80992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:37 compute-0 python3.9[80994]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:37 compute-0 sudo[80992]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:38 compute-0 sudo[81144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pawswbmskuogufgnjuvwqmrmdhwkvffs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804797.8935988-242-147683168901161/AnsiballZ_stat.py'
Nov 22 09:46:38 compute-0 sudo[81144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:38 compute-0 python3.9[81146]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:38 compute-0 sudo[81144]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:38 compute-0 sudo[81267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyixusrdyhtgxdizphmrhllhellmiqex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804797.8935988-242-147683168901161/AnsiballZ_copy.py'
Nov 22 09:46:38 compute-0 sudo[81267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:38 compute-0 python3.9[81269]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804797.8935988-242-147683168901161/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=c89fc24e2f42e47273c8e0d911e37202a344cd66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:38 compute-0 sudo[81267]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:39 compute-0 sudo[81419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrjubvewifycdkcthxdfyhfxtgmjhuzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804799.0679178-242-171731911380048/AnsiballZ_stat.py'
Nov 22 09:46:39 compute-0 sudo[81419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:39 compute-0 python3.9[81421]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:39 compute-0 sudo[81419]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:40 compute-0 sudo[81542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqjktoeyelzcvouxlhnkmmhnpknjefkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804799.0679178-242-171731911380048/AnsiballZ_copy.py'
Nov 22 09:46:40 compute-0 sudo[81542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:40 compute-0 python3.9[81544]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804799.0679178-242-171731911380048/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=655cb7ec6765ae7da7b7aec3220a36fd244541a3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:40 compute-0 sudo[81542]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:40 compute-0 sudo[81694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqdgrpngfotlsogfbetsyqfiyjvtsoiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804800.396644-242-134823843273734/AnsiballZ_stat.py'
Nov 22 09:46:40 compute-0 sudo[81694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:40 compute-0 python3.9[81696]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:40 compute-0 sudo[81694]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:41 compute-0 sudo[81817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slgjsctvejpxrrafgozkwhsbbhhcyusm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804800.396644-242-134823843273734/AnsiballZ_copy.py'
Nov 22 09:46:41 compute-0 sudo[81817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:41 compute-0 python3.9[81819]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804800.396644-242-134823843273734/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=8c8714e2648d17a9d501f3d4e7576dc0a0b9f9b5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:41 compute-0 sudo[81817]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:42 compute-0 sudo[81969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzldjmhrqnuzukynzkmjmoekmpjwtwug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804802.2941616-302-101056126266160/AnsiballZ_file.py'
Nov 22 09:46:42 compute-0 sudo[81969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:42 compute-0 python3.9[81971]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:43 compute-0 sudo[81969]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:43 compute-0 sudo[82121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajcruhucmohaamhobllwmlgblzpnzrnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804803.216015-310-42012273559813/AnsiballZ_stat.py'
Nov 22 09:46:43 compute-0 sudo[82121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:43 compute-0 python3.9[82123]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:43 compute-0 sudo[82121]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:44 compute-0 sudo[82244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fepoqgzuntwprldopqkuycwxvwvjeuoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804803.216015-310-42012273559813/AnsiballZ_copy.py'
Nov 22 09:46:44 compute-0 sudo[82244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:44 compute-0 python3.9[82246]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804803.216015-310-42012273559813/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=055ba4bab0d3961120a057b550a73cdd0a7df715 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:44 compute-0 sudo[82244]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:44 compute-0 sudo[82396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brnxqnkjwcunosardhjhjnecgnvqynyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804804.4720025-326-92094494726903/AnsiballZ_file.py'
Nov 22 09:46:44 compute-0 sudo[82396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:44 compute-0 python3.9[82398]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:44 compute-0 sudo[82396]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:45 compute-0 sudo[82548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbeuzzzmmpxhloazmvuefgcgaffwotbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804805.1465719-334-128516114659994/AnsiballZ_stat.py'
Nov 22 09:46:45 compute-0 sudo[82548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:45 compute-0 python3.9[82550]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:45 compute-0 sudo[82548]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:46 compute-0 sudo[82671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aonzlcdicpzpahskkcscfugdtaaylvqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804805.1465719-334-128516114659994/AnsiballZ_copy.py'
Nov 22 09:46:46 compute-0 sudo[82671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:46 compute-0 python3.9[82673]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804805.1465719-334-128516114659994/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=055ba4bab0d3961120a057b550a73cdd0a7df715 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:46 compute-0 sudo[82671]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:47 compute-0 sudo[82823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvidedhyfhqggvpvywqghgkbgoesyrwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804806.678846-350-163193274666156/AnsiballZ_file.py'
Nov 22 09:46:47 compute-0 sudo[82823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:47 compute-0 python3.9[82825]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:47 compute-0 sudo[82823]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:47 compute-0 sudo[82975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssnpehdwkjflgkengqmrcurfkcarcaep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804807.4564228-358-273512113000051/AnsiballZ_stat.py'
Nov 22 09:46:47 compute-0 sudo[82975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:48 compute-0 python3.9[82977]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:48 compute-0 sudo[82975]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:48 compute-0 sudo[83098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugxmkbeffftpjhhhhqucilkswxdzhqui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804807.4564228-358-273512113000051/AnsiballZ_copy.py'
Nov 22 09:46:48 compute-0 sudo[83098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:48 compute-0 python3.9[83100]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804807.4564228-358-273512113000051/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=055ba4bab0d3961120a057b550a73cdd0a7df715 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:48 compute-0 sudo[83098]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:49 compute-0 sudo[83250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djrnozrcljgnzeyywyaktipdcnhahrps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804808.9013262-374-181275758126445/AnsiballZ_file.py'
Nov 22 09:46:49 compute-0 sudo[83250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:49 compute-0 python3.9[83252]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:49 compute-0 sudo[83250]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:49 compute-0 sudo[83402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvcszzgbauweybbvfohjbgdutzijwftk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804809.6010346-382-50071139606891/AnsiballZ_stat.py'
Nov 22 09:46:49 compute-0 sudo[83402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:50 compute-0 python3.9[83404]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:50 compute-0 sudo[83402]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:50 compute-0 sudo[83525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eccrptuppmkrfnxhwtpgidxvaclgccho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804809.6010346-382-50071139606891/AnsiballZ_copy.py'
Nov 22 09:46:50 compute-0 sudo[83525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:50 compute-0 python3.9[83527]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804809.6010346-382-50071139606891/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=055ba4bab0d3961120a057b550a73cdd0a7df715 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:50 compute-0 sudo[83525]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:51 compute-0 sudo[83677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjmmwkynpyvfziziuvrbgmbdxkftaqsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804811.1690407-398-212873872122485/AnsiballZ_file.py'
Nov 22 09:46:51 compute-0 sudo[83677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:51 compute-0 python3.9[83679]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:51 compute-0 sudo[83677]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:52 compute-0 sudo[83829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiziabxrnifebcpzwuqttmrxflsgrzla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804811.8628566-406-203406005355871/AnsiballZ_stat.py'
Nov 22 09:46:52 compute-0 sudo[83829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:52 compute-0 python3.9[83831]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:52 compute-0 sudo[83829]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:52 compute-0 sudo[83952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntlqlkczyjtonthphubzxnbbvxzrsgav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804811.8628566-406-203406005355871/AnsiballZ_copy.py'
Nov 22 09:46:52 compute-0 sudo[83952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:53 compute-0 python3.9[83954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804811.8628566-406-203406005355871/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=055ba4bab0d3961120a057b550a73cdd0a7df715 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:53 compute-0 sudo[83952]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:53 compute-0 sudo[84104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gweqibtxlyqoyftwlrjvohwybxjaehyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804813.480548-422-126398914213774/AnsiballZ_file.py'
Nov 22 09:46:53 compute-0 sudo[84104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:54 compute-0 python3.9[84106]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:54 compute-0 sudo[84104]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:54 compute-0 sudo[84256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlfciqllexkldkplivzqauievwpkaply ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804814.260855-430-254382196097236/AnsiballZ_stat.py'
Nov 22 09:46:54 compute-0 sudo[84256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:54 compute-0 python3.9[84258]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:54 compute-0 sudo[84256]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:55 compute-0 sudo[84379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eugyrkenfitldfxyskvhqjlhjnideizi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804814.260855-430-254382196097236/AnsiballZ_copy.py'
Nov 22 09:46:55 compute-0 sudo[84379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:55 compute-0 python3.9[84381]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804814.260855-430-254382196097236/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=055ba4bab0d3961120a057b550a73cdd0a7df715 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:55 compute-0 sudo[84379]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:56 compute-0 sudo[84531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbcsbkzgtqdlozouadifufuzbijzuknc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804815.70214-446-267311425391853/AnsiballZ_file.py'
Nov 22 09:46:56 compute-0 sudo[84531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:56 compute-0 python3.9[84533]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:46:56 compute-0 sudo[84531]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:56 compute-0 sudo[84683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyvdcqehhxjmyrhapeadgnhvmmrdtaxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804816.4720232-454-68726457987230/AnsiballZ_stat.py'
Nov 22 09:46:56 compute-0 sudo[84683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:57 compute-0 python3.9[84685]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:46:57 compute-0 sudo[84683]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:57 compute-0 sudo[84806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qampfmborlyzlgfurycyyurscvnjgyou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804816.4720232-454-68726457987230/AnsiballZ_copy.py'
Nov 22 09:46:57 compute-0 sudo[84806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:46:57 compute-0 python3.9[84808]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804816.4720232-454-68726457987230/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=055ba4bab0d3961120a057b550a73cdd0a7df715 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:46:57 compute-0 sudo[84806]: pam_unix(sudo:session): session closed for user root
Nov 22 09:46:58 compute-0 sshd-session[77149]: Connection closed by 192.168.122.30 port 44546
Nov 22 09:46:58 compute-0 sshd-session[77146]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:46:58 compute-0 systemd-logind[819]: Session 18 logged out. Waiting for processes to exit.
Nov 22 09:46:58 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Nov 22 09:46:58 compute-0 systemd[1]: session-18.scope: Consumed 29.797s CPU time.
Nov 22 09:46:58 compute-0 systemd-logind[819]: Removed session 18.
Nov 22 09:47:03 compute-0 sshd-session[84833]: Accepted publickey for zuul from 192.168.122.30 port 51850 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:47:03 compute-0 systemd-logind[819]: New session 19 of user zuul.
Nov 22 09:47:03 compute-0 systemd[1]: Started Session 19 of User zuul.
Nov 22 09:47:03 compute-0 sshd-session[84833]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:47:04 compute-0 python3.9[84986]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:47:05 compute-0 sudo[85140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syxeyibfrnjdadstgpzggqtplhfklkkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804824.9063413-34-172639872127010/AnsiballZ_file.py'
Nov 22 09:47:05 compute-0 sudo[85140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:05 compute-0 python3.9[85142]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:47:05 compute-0 sudo[85140]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:06 compute-0 sudo[85292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baqaxgmvggwpuufikwfjpvsulbgxdvza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804825.7245977-34-226662253617676/AnsiballZ_file.py'
Nov 22 09:47:06 compute-0 sudo[85292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:06 compute-0 python3.9[85294]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:47:06 compute-0 sudo[85292]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:07 compute-0 python3.9[85444]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:47:07 compute-0 sudo[85594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivxoeaoftnhaekrosuntduyjqclozrfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804827.4323418-57-245302034646109/AnsiballZ_seboolean.py'
Nov 22 09:47:07 compute-0 sudo[85594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:08 compute-0 python3.9[85596]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 22 09:47:09 compute-0 sudo[85594]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:10 compute-0 sudo[85750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddagvmqhtxadmjldcxkaenbfzhrsjdwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804829.7850819-67-109997059411289/AnsiballZ_setup.py'
Nov 22 09:47:10 compute-0 dbus-broker-launch[813]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 22 09:47:10 compute-0 sudo[85750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:10 compute-0 python3.9[85752]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:47:10 compute-0 sudo[85750]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:11 compute-0 sudo[85834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnmcclffksdsbfxvcsohboqqtjspucfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804829.7850819-67-109997059411289/AnsiballZ_dnf.py'
Nov 22 09:47:11 compute-0 sudo[85834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:11 compute-0 python3.9[85836]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:47:12 compute-0 sudo[85834]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:13 compute-0 sudo[85987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbbtinxkrghhvnirjspvacxotpeceiow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804833.1015666-79-30634310954524/AnsiballZ_systemd.py'
Nov 22 09:47:13 compute-0 sudo[85987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:14 compute-0 python3.9[85989]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 09:47:14 compute-0 sudo[85987]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:14 compute-0 sudo[86142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwwseiwzdxmptihxchqstfojgnhhnkwd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763804834.5003765-87-8782519841203/AnsiballZ_edpm_nftables_snippet.py'
Nov 22 09:47:14 compute-0 sudo[86142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:15 compute-0 python3[86144]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 22 09:47:15 compute-0 sudo[86142]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:15 compute-0 sudo[86294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnnmhzcmhrsiepegmjydigjaisugvjmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804835.485843-96-140828699276172/AnsiballZ_file.py'
Nov 22 09:47:15 compute-0 sudo[86294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:15 compute-0 python3.9[86296]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:15 compute-0 sudo[86294]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:16 compute-0 sudo[86446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vziocvpvjpjvnowttcqtwzluwgitqtim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804836.1302502-104-185080523888748/AnsiballZ_stat.py'
Nov 22 09:47:16 compute-0 sudo[86446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:16 compute-0 python3.9[86448]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:16 compute-0 sudo[86446]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:17 compute-0 sudo[86524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxekvffwxjxobkmkabemvedljjhywvon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804836.1302502-104-185080523888748/AnsiballZ_file.py'
Nov 22 09:47:17 compute-0 sudo[86524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:17 compute-0 python3.9[86526]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:17 compute-0 sudo[86524]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:17 compute-0 sudo[86676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiscsscfsxiewbhbjcaaipkxhklwkmkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804837.62509-116-227986351025242/AnsiballZ_stat.py'
Nov 22 09:47:17 compute-0 sudo[86676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:18 compute-0 python3.9[86678]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:18 compute-0 sudo[86676]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:18 compute-0 sudo[86754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybsnskjzfvcrvuuhplluhaftgbsjkocd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804837.62509-116-227986351025242/AnsiballZ_file.py'
Nov 22 09:47:18 compute-0 sudo[86754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:18 compute-0 python3.9[86756]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mufozvrz recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:18 compute-0 sudo[86754]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:19 compute-0 sudo[86906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvbflpwvrzylhlwaeipgkqdzkkbxmpwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804838.8999872-128-143647794161830/AnsiballZ_stat.py'
Nov 22 09:47:19 compute-0 sudo[86906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:19 compute-0 python3.9[86908]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:19 compute-0 sudo[86906]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:19 compute-0 sudo[86984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnmlgbzevjpwubbvgchexbdkvgxuvyxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804838.8999872-128-143647794161830/AnsiballZ_file.py'
Nov 22 09:47:19 compute-0 sudo[86984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:19 compute-0 python3.9[86986]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:19 compute-0 sudo[86984]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:20 compute-0 sudo[87136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjejzefxwahsxppgyxoqfooplwpzmfca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804840.199555-141-220354757670247/AnsiballZ_command.py'
Nov 22 09:47:20 compute-0 sudo[87136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:20 compute-0 python3.9[87138]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:47:20 compute-0 sudo[87136]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:21 compute-0 sudo[87289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amzipkenlwrdjeexhsjfcvmbynkosxav ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763804841.1317437-149-178117938699938/AnsiballZ_edpm_nftables_from_files.py'
Nov 22 09:47:21 compute-0 sudo[87289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:21 compute-0 python3[87291]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 09:47:21 compute-0 sudo[87289]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:22 compute-0 sudo[87441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxvgywiaodhnlfxuvohmfcfrxwnnmxvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804842.1151662-157-7933211322707/AnsiballZ_stat.py'
Nov 22 09:47:22 compute-0 sudo[87441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:22 compute-0 python3.9[87443]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:22 compute-0 sudo[87441]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:23 compute-0 sudo[87566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocwbrjrvrtnvcgvneumametzyidtdxhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804842.1151662-157-7933211322707/AnsiballZ_copy.py'
Nov 22 09:47:23 compute-0 sudo[87566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:23 compute-0 python3.9[87568]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804842.1151662-157-7933211322707/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:23 compute-0 sudo[87566]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:24 compute-0 sudo[87718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgohjsbfmtcjzrjytubvzvwdjnqfkwxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804843.7804089-172-176096414745265/AnsiballZ_stat.py'
Nov 22 09:47:24 compute-0 sudo[87718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:24 compute-0 python3.9[87720]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:24 compute-0 sudo[87718]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:24 compute-0 sudo[87843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuydjxmppmzkenuoejvdwojfhfafwiwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804843.7804089-172-176096414745265/AnsiballZ_copy.py'
Nov 22 09:47:24 compute-0 sudo[87843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:24 compute-0 python3.9[87845]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804843.7804089-172-176096414745265/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:24 compute-0 sudo[87843]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:25 compute-0 sudo[87995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgoretbbrewdjmesvgtlunhzuxgyxhyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804845.1365786-187-134435236109463/AnsiballZ_stat.py'
Nov 22 09:47:25 compute-0 sudo[87995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:25 compute-0 python3.9[87997]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:25 compute-0 sudo[87995]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:26 compute-0 sudo[88120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tglhvrgpviqhlnstqioupnbriuihvhxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804845.1365786-187-134435236109463/AnsiballZ_copy.py'
Nov 22 09:47:26 compute-0 sudo[88120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:26 compute-0 python3.9[88122]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804845.1365786-187-134435236109463/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:26 compute-0 sudo[88120]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:27 compute-0 sudo[88272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bptyszijdzknzbonxphivazrcmoeylak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804846.7051291-202-65719986610792/AnsiballZ_stat.py'
Nov 22 09:47:27 compute-0 sudo[88272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:27 compute-0 python3.9[88274]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:27 compute-0 sudo[88272]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:27 compute-0 sudo[88397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsdrbrxcvqbynvruwqyrkfharaugszbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804846.7051291-202-65719986610792/AnsiballZ_copy.py'
Nov 22 09:47:27 compute-0 sudo[88397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:27 compute-0 python3.9[88399]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804846.7051291-202-65719986610792/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:28 compute-0 sudo[88397]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:28 compute-0 sudo[88549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doaldwbhqmaxvyvleepsdmzdfvecxame ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804848.188999-217-181033283773500/AnsiballZ_stat.py'
Nov 22 09:47:28 compute-0 sudo[88549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:28 compute-0 python3.9[88551]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:28 compute-0 sudo[88549]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:29 compute-0 sudo[88674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewloutsuxzlvrwsrumrwmlwzblyljojv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804848.188999-217-181033283773500/AnsiballZ_copy.py'
Nov 22 09:47:29 compute-0 sudo[88674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:29 compute-0 python3.9[88676]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763804848.188999-217-181033283773500/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:29 compute-0 sudo[88674]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:29 compute-0 sudo[88826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnvwpfllogwifmioyxqhwwvxxrexkzdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804849.6416323-232-276311207625483/AnsiballZ_file.py'
Nov 22 09:47:29 compute-0 sudo[88826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:30 compute-0 python3.9[88828]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:30 compute-0 sudo[88826]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:30 compute-0 sudo[88978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffweciydknjxftqoioaddnxapfumldab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804850.413631-240-235891749154616/AnsiballZ_command.py'
Nov 22 09:47:30 compute-0 sudo[88978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:30 compute-0 python3.9[88980]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:47:31 compute-0 sudo[88978]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:31 compute-0 sudo[89133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdrwsiqyzcfobbcefsnczdvdndctdbuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804851.2417815-248-103796213351272/AnsiballZ_blockinfile.py'
Nov 22 09:47:31 compute-0 sudo[89133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:31 compute-0 python3.9[89135]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:31 compute-0 sudo[89133]: pam_unix(sudo:session): session closed for user root
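Annotation: the blockinfile invocation above maintains a managed block in /etc/sysconfig/nftables.conf, validated with `nft -c -f %s` before the write. From the marker and block parameters recorded in the log, the resulting block would look roughly like this (a reconstruction from the logged parameters, not captured file output):

```
# BEGIN ANSIBLE MANAGED BLOCK
include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
# END ANSIBLE MANAGED BLOCK
```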
Nov 22 09:47:32 compute-0 sudo[89285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woxstclutrmgibtsjlunbbxeajrebdtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804852.2496405-257-256964246985829/AnsiballZ_command.py'
Nov 22 09:47:32 compute-0 sudo[89285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:32 compute-0 python3.9[89287]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:47:32 compute-0 sudo[89285]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:33 compute-0 sudo[89438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldnntkritocsntgykvpswhvqfjlwwlhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804853.1491704-265-116404117726137/AnsiballZ_stat.py'
Nov 22 09:47:33 compute-0 sudo[89438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:33 compute-0 python3.9[89440]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:47:33 compute-0 sudo[89438]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:34 compute-0 sudo[89592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xghkxnvotxapjgcugvzazjakoabisbsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804853.919281-273-179518774109461/AnsiballZ_command.py'
Nov 22 09:47:34 compute-0 sudo[89592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:34 compute-0 python3.9[89594]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:47:34 compute-0 sudo[89592]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:35 compute-0 sudo[89747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dijdvbzjswfabatmarilwxiizghsomuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804854.7215948-281-222466118040574/AnsiballZ_file.py'
Nov 22 09:47:35 compute-0 sudo[89747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:35 compute-0 python3.9[89749]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:35 compute-0 sudo[89747]: pam_unix(sudo:session): session closed for user root
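Annotation: the sequence from 09:47:30 to 09:47:35 implements a change-sentinel pattern: after writing the rule fragments, the play touches /etc/nftables/edpm-rules.nft.changed, check-validates the concatenated ruleset (`nft -c -f -`), applies the flush/rules/update-jumps subset (`nft -f -`), and finally removes the sentinel. A minimal Python sketch of the same control flow, assuming injected `validate`/`apply_fn` callables in place of the real `nft` invocations so the sketch has no nft dependency:

```python
from pathlib import Path


def apply_ruleset(fragments, sentinel, validate, apply_fn):
    """Apply nftables fragments only when the sentinel marks a change.

    `validate` and `apply_fn` stand in for `nft -c -f -` and `nft -f -`
    respectively; they are hypothetical callables, not real nft bindings.
    """
    if not sentinel.exists():
        return False                 # nothing changed since the last run
    # Concatenate the fragment files, as `cat ... | nft` does in the log.
    ruleset = "\n".join(Path(p).read_text() for p in fragments)
    validate(ruleset)                # raises on syntax error, like `nft -c`
    apply_fn(ruleset)                # loads the ruleset, like `nft -f -`
    sentinel.unlink()                # clear the change marker
    return True
```

The sentinel makes the apply step idempotent across reruns: if validation or apply fails, the marker survives and the next run retries; on success it is removed, so an unchanged ruleset is never reloaded.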
Nov 22 09:47:36 compute-0 python3.9[89899]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:47:37 compute-0 sudo[90050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aezdtluugnbgyaottsinvnkkdhhogtup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804857.1781483-321-165639829696186/AnsiballZ_command.py'
Nov 22 09:47:37 compute-0 sudo[90050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:37 compute-0 python3.9[90052]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:93:45:69:49" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:47:37 compute-0 ovs-vsctl[90053]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:93:45:69:49 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 22 09:47:37 compute-0 sudo[90050]: pam_unix(sudo:session): session closed for user root
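Annotation: the `ovs-vsctl set open .` call above writes the OVN chassis configuration (encapsulation, bridge mappings, southbound remote) as `external_ids` key/value pairs on the Open_vSwitch record. A small Python helper sketching how such an argv is assembled from a mapping (the function name is illustrative, not part of any real tooling):

```python
def ovs_external_ids_cmd(ids):
    """Build the argv for `ovs-vsctl set open .` from a mapping of
    external_ids, mirroring the invocation recorded in the log."""
    cmd = ["ovs-vsctl", "set", "open", "."]
    cmd += [f"external_ids:{key}={value}" for key, value in ids.items()]
    return cmd
```

Passing a list rather than a shell string avoids the quoting pitfalls visible in the logged command (the MAC-mapping value needed explicit quotes under `_uses_shell=True`).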
Nov 22 09:47:38 compute-0 sudo[90203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmrojuxzjwtqtjqdeykncxataygjsvaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804858.0632386-330-158674837230001/AnsiballZ_command.py'
Nov 22 09:47:38 compute-0 sudo[90203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:38 compute-0 python3.9[90205]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:47:38 compute-0 sudo[90203]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:39 compute-0 sudo[90358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpevxpzahcztkthhfnuehwdeijqevzpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804858.786732-338-171983470944843/AnsiballZ_command.py'
Nov 22 09:47:39 compute-0 sudo[90358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:39 compute-0 python3.9[90360]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:6640:127.0.0.1\" -- add Open_vSwitch . manager_options @manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:47:39 compute-0 ovs-vsctl[90361]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 22 09:47:39 compute-0 sudo[90358]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:39 compute-0 python3.9[90511]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:47:40 compute-0 sudo[90663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyozvlrbbhigfebponhkzkhijhkftdam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804860.1799664-355-254919225917336/AnsiballZ_file.py'
Nov 22 09:47:40 compute-0 sudo[90663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:40 compute-0 python3.9[90665]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:47:40 compute-0 sudo[90663]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:41 compute-0 sudo[90815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzseqlyvishovlpuclnjxipydoevybbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804860.9279056-363-107517485467018/AnsiballZ_stat.py'
Nov 22 09:47:41 compute-0 sudo[90815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:41 compute-0 python3.9[90817]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:41 compute-0 sudo[90815]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:41 compute-0 sudo[90893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdbznllakcibratqbntllznziniotxaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804860.9279056-363-107517485467018/AnsiballZ_file.py'
Nov 22 09:47:41 compute-0 sudo[90893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:41 compute-0 python3.9[90895]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:47:41 compute-0 sudo[90893]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:42 compute-0 sudo[91045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-circlhdgjdxcnmpvuyygmfusuuzpigzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804862.1398106-363-210522297443627/AnsiballZ_stat.py'
Nov 22 09:47:42 compute-0 sudo[91045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:42 compute-0 python3.9[91047]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:42 compute-0 sudo[91045]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:43 compute-0 sudo[91123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlrsmrnzxjhzjhxgzyixahjuemvdrzxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804862.1398106-363-210522297443627/AnsiballZ_file.py'
Nov 22 09:47:43 compute-0 sudo[91123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:43 compute-0 python3.9[91125]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:47:43 compute-0 sudo[91123]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:43 compute-0 sudo[91275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tahtbkqhctuawrmxqlzldpyijyxndotw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804863.4744704-386-134988272882264/AnsiballZ_file.py'
Nov 22 09:47:43 compute-0 sudo[91275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:43 compute-0 python3.9[91277]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:44 compute-0 sudo[91275]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:44 compute-0 sudo[91427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbppquowyygmdzlrxytjggxsrzisnqwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804864.2153358-394-154181396529302/AnsiballZ_stat.py'
Nov 22 09:47:44 compute-0 sudo[91427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:44 compute-0 python3.9[91429]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:44 compute-0 sudo[91427]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:44 compute-0 sudo[91505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnngrelknrlxwrjrnmkbnwmftxvatqol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804864.2153358-394-154181396529302/AnsiballZ_file.py'
Nov 22 09:47:45 compute-0 sudo[91505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:45 compute-0 python3.9[91507]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:45 compute-0 sudo[91505]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:45 compute-0 sudo[91657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abowdgtsfdyrtdtamgxqquzluwrtfbgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804865.4043646-406-31092436677412/AnsiballZ_stat.py'
Nov 22 09:47:45 compute-0 sudo[91657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:45 compute-0 python3.9[91659]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:46 compute-0 sudo[91657]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:46 compute-0 sudo[91735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdaitycyjshdvvhobkmaloxmqparfvvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804865.4043646-406-31092436677412/AnsiballZ_file.py'
Nov 22 09:47:46 compute-0 sudo[91735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:46 compute-0 python3.9[91737]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:46 compute-0 sudo[91735]: pam_unix(sudo:session): session closed for user root
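Annotation: the play installs both a unit file (/etc/systemd/system/edpm-container-shutdown.service) and a matching preset (/etc/systemd/system-preset/91-edpm-container-shutdown.preset) so the service is enabled by default on `systemctl preset-all`. The preset contents are not logged; by standard systemd.preset syntax it would plausibly be a single directive like:

```
enable edpm-container-shutdown.service
```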
Nov 22 09:47:47 compute-0 sudo[91887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ospcppbjgaulndhvzvwopoqfqpwndwlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804866.680921-418-255323865902400/AnsiballZ_systemd.py'
Nov 22 09:47:47 compute-0 sudo[91887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:47 compute-0 python3.9[91889]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:47:47 compute-0 systemd[1]: Reloading.
Nov 22 09:47:47 compute-0 systemd-sysv-generator[91922]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:47:47 compute-0 systemd-rc-local-generator[91918]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:47:47 compute-0 sudo[91887]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:48 compute-0 sudo[92077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbhaudsagulmooxebhdohmrooupidglw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804867.8575053-426-119421657380494/AnsiballZ_stat.py'
Nov 22 09:47:48 compute-0 sudo[92077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:48 compute-0 python3.9[92079]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:48 compute-0 sudo[92077]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:48 compute-0 sudo[92155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqownotykkhflvrzevvgnwpoxykjlvyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804867.8575053-426-119421657380494/AnsiballZ_file.py'
Nov 22 09:47:48 compute-0 sudo[92155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:48 compute-0 python3.9[92157]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:48 compute-0 sudo[92155]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:49 compute-0 sudo[92307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvzxooqfnmgdarsexiybdoqtbzuzzctn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804869.0863864-438-270160453937965/AnsiballZ_stat.py'
Nov 22 09:47:49 compute-0 sudo[92307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:49 compute-0 python3.9[92309]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:49 compute-0 sudo[92307]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:49 compute-0 sudo[92385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoxsijjtoobxudqyvcdvjspebhopacze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804869.0863864-438-270160453937965/AnsiballZ_file.py'
Nov 22 09:47:49 compute-0 sudo[92385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:50 compute-0 python3.9[92387]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:50 compute-0 sudo[92385]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:50 compute-0 sudo[92537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqqjvqpdrnyxvfthwbjddtuxcfxawfsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804870.3887024-450-38078358611766/AnsiballZ_systemd.py'
Nov 22 09:47:50 compute-0 sudo[92537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:51 compute-0 python3.9[92539]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:47:51 compute-0 systemd[1]: Reloading.
Nov 22 09:47:51 compute-0 systemd-rc-local-generator[92567]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:47:51 compute-0 systemd-sysv-generator[92571]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:47:51 compute-0 systemd[1]: Starting Create netns directory...
Nov 22 09:47:51 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 09:47:51 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 09:47:51 compute-0 systemd[1]: Finished Create netns directory.
Nov 22 09:47:51 compute-0 sudo[92537]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:52 compute-0 sudo[92731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdphypymbwrpxqgoqfbfepfppcqwlmbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804871.7853734-460-13010972296597/AnsiballZ_file.py'
Nov 22 09:47:52 compute-0 sudo[92731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:52 compute-0 python3.9[92733]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:47:52 compute-0 sudo[92731]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:52 compute-0 sudo[92883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvnrirfzvddekajjfhnllnpfinpnveyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804872.598125-468-62005427029633/AnsiballZ_stat.py'
Nov 22 09:47:52 compute-0 sudo[92883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:53 compute-0 python3.9[92885]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:53 compute-0 sudo[92883]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:53 compute-0 sudo[93006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgubrqbdyztnuhdtppxxwgcvdyfcwgyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804872.598125-468-62005427029633/AnsiballZ_copy.py'
Nov 22 09:47:53 compute-0 sudo[93006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:53 compute-0 python3.9[93008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804872.598125-468-62005427029633/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:47:53 compute-0 sudo[93006]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:54 compute-0 sudo[93158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgboslpnnwerfzogjtrwdmwoupfjcjdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804874.3002295-485-217605551016460/AnsiballZ_file.py'
Nov 22 09:47:54 compute-0 sudo[93158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:54 compute-0 python3.9[93160]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:47:54 compute-0 sudo[93158]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:55 compute-0 sudo[93310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkmaqqsefyhozroeyxxdrriwoztbebbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804875.0655205-493-82040426041982/AnsiballZ_stat.py'
Nov 22 09:47:55 compute-0 sudo[93310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:55 compute-0 python3.9[93312]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:47:55 compute-0 sudo[93310]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:55 compute-0 sudo[93433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwubmwzchvoskbozueidjbkyfxlvfrpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804875.0655205-493-82040426041982/AnsiballZ_copy.py'
Nov 22 09:47:55 compute-0 sudo[93433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:56 compute-0 python3.9[93435]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804875.0655205-493-82040426041982/.source.json _original_basename=.efjpzfkd follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:56 compute-0 sudo[93433]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:56 compute-0 sudo[93585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbdoojqsnouqntebyshtvxpdegneybyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804876.369502-508-1548465462197/AnsiballZ_file.py'
Nov 22 09:47:56 compute-0 sudo[93585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:56 compute-0 python3.9[93587]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:47:56 compute-0 sudo[93585]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:57 compute-0 sudo[93737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdlamohveufdibrgtghndmasynkzkdca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804877.213847-516-271165380523099/AnsiballZ_stat.py'
Nov 22 09:47:57 compute-0 sudo[93737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:57 compute-0 sudo[93737]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:58 compute-0 sudo[93861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xstsdklsbbzxxoethjvgtgqcjfuilpte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804877.213847-516-271165380523099/AnsiballZ_copy.py'
Nov 22 09:47:58 compute-0 sudo[93861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:58 compute-0 sudo[93861]: pam_unix(sudo:session): session closed for user root
Nov 22 09:47:59 compute-0 sudo[94013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wufglhvhprauqaqejvemfefjyzrjzcmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804878.7218602-533-124404675245306/AnsiballZ_container_config_data.py'
Nov 22 09:47:59 compute-0 sudo[94013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:47:59 compute-0 python3.9[94015]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 22 09:47:59 compute-0 sudo[94013]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:00 compute-0 sudo[94165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noseqmqycscafzjeszzfnjpdhntzekxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804879.6616921-542-277663089920253/AnsiballZ_container_config_hash.py'
Nov 22 09:48:00 compute-0 sudo[94165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:00 compute-0 python3.9[94167]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 09:48:00 compute-0 sudo[94165]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:01 compute-0 sudo[94317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfsdmteyeumutqyntzexvfusvhqigouf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804880.7307556-551-190576540840716/AnsiballZ_podman_container_info.py'
Nov 22 09:48:01 compute-0 sudo[94317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:01 compute-0 python3.9[94319]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 09:48:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:48:01 compute-0 sudo[94317]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:02 compute-0 sudo[94479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrzesujkbzqwljqmkikudvbqqozhjhib ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763804882.1485922-564-130051059271813/AnsiballZ_edpm_container_manage.py'
Nov 22 09:48:02 compute-0 sudo[94479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:02 compute-0 python3[94481]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 09:48:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:48:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:48:03 compute-0 podman[94516]: 2025-11-22 09:48:03.209034568 +0000 UTC m=+0.092365840 container create e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Nov 22 09:48:03 compute-0 podman[94516]: 2025-11-22 09:48:03.155889754 +0000 UTC m=+0.039221066 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 09:48:03 compute-0 python3[94481]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 09:48:03 compute-0 sudo[94479]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:03 compute-0 sudo[94703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgdjcstvsldfplhlwrbwavgdxbkkzlup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804883.65461-572-126810513705327/AnsiballZ_stat.py'
Nov 22 09:48:03 compute-0 sudo[94703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 09:48:04 compute-0 python3.9[94705]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:48:04 compute-0 sudo[94703]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:04 compute-0 sudo[94857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogjjgqjcddziwhukqmqksiilsfpgmwit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804884.4951973-581-219466124694286/AnsiballZ_file.py'
Nov 22 09:48:04 compute-0 sudo[94857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:04 compute-0 python3.9[94859]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:48:04 compute-0 sudo[94857]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:05 compute-0 sudo[94933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccgxmnbivvqamyowovvmtnxooxvnlpfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804884.4951973-581-219466124694286/AnsiballZ_stat.py'
Nov 22 09:48:05 compute-0 sudo[94933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:05 compute-0 python3.9[94935]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:48:05 compute-0 sudo[94933]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:06 compute-0 sudo[95084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjufrloiasjyenwbkfhnbqasbdefseiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804885.4902687-581-269979779876183/AnsiballZ_copy.py'
Nov 22 09:48:06 compute-0 sudo[95084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:06 compute-0 python3.9[95086]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763804885.4902687-581-269979779876183/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:48:06 compute-0 sudo[95084]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:06 compute-0 sudo[95160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmddbnzlvazklonezxispporkyxbsbkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804885.4902687-581-269979779876183/AnsiballZ_systemd.py'
Nov 22 09:48:06 compute-0 sudo[95160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:06 compute-0 python3.9[95162]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:48:06 compute-0 systemd[1]: Reloading.
Nov 22 09:48:06 compute-0 systemd-rc-local-generator[95189]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:48:06 compute-0 systemd-sysv-generator[95193]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:48:07 compute-0 sudo[95160]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:07 compute-0 sudo[95270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znbzbfwgwyrogrorpjdyahpukkbhbwiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804885.4902687-581-269979779876183/AnsiballZ_systemd.py'
Nov 22 09:48:07 compute-0 sudo[95270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:07 compute-0 python3.9[95272]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:48:07 compute-0 systemd[1]: Reloading.
Nov 22 09:48:07 compute-0 systemd-rc-local-generator[95303]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:48:07 compute-0 systemd-sysv-generator[95306]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:48:08 compute-0 systemd[1]: Starting ovn_controller container...
Nov 22 09:48:08 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 22 09:48:08 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:48:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b3ab10994b20c637c93d36606bc06b2040f0411bb53540b8bd1dff30749b03b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 09:48:08 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e.
Nov 22 09:48:08 compute-0 podman[95314]: 2025-11-22 09:48:08.313847131 +0000 UTC m=+0.160141802 container init e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 09:48:08 compute-0 ovn_controller[95329]: + sudo -E kolla_set_configs
Nov 22 09:48:08 compute-0 podman[95314]: 2025-11-22 09:48:08.354510577 +0000 UTC m=+0.200805138 container start e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 09:48:08 compute-0 edpm-start-podman-container[95314]: ovn_controller
Nov 22 09:48:08 compute-0 systemd[1]: Created slice User Slice of UID 0.
Nov 22 09:48:08 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 22 09:48:08 compute-0 podman[95336]: 2025-11-22 09:48:08.414181203 +0000 UTC m=+0.050002277 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 09:48:08 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 22 09:48:08 compute-0 edpm-start-podman-container[95313]: Creating additional drop-in dependency for "ovn_controller" (e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e)
Nov 22 09:48:08 compute-0 systemd[1]: Starting User Manager for UID 0...
Nov 22 09:48:08 compute-0 systemd[1]: e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e-5e49371886dc53a3.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 09:48:08 compute-0 systemd[1]: e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e-5e49371886dc53a3.service: Failed with result 'exit-code'.
Nov 22 09:48:08 compute-0 systemd[95371]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Nov 22 09:48:08 compute-0 systemd[1]: Reloading.
Nov 22 09:48:08 compute-0 systemd-rc-local-generator[95416]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:48:08 compute-0 systemd-sysv-generator[95419]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:48:08 compute-0 systemd[95371]: Queued start job for default target Main User Target.
Nov 22 09:48:08 compute-0 systemd[95371]: Created slice User Application Slice.
Nov 22 09:48:08 compute-0 systemd[95371]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 22 09:48:08 compute-0 systemd[95371]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 09:48:08 compute-0 systemd[95371]: Reached target Paths.
Nov 22 09:48:08 compute-0 systemd[95371]: Reached target Timers.
Nov 22 09:48:08 compute-0 systemd[95371]: Starting D-Bus User Message Bus Socket...
Nov 22 09:48:08 compute-0 systemd[95371]: Starting Create User's Volatile Files and Directories...
Nov 22 09:48:08 compute-0 systemd[95371]: Listening on D-Bus User Message Bus Socket.
Nov 22 09:48:08 compute-0 systemd[95371]: Reached target Sockets.
Nov 22 09:48:08 compute-0 systemd[95371]: Finished Create User's Volatile Files and Directories.
Nov 22 09:48:08 compute-0 systemd[95371]: Reached target Basic System.
Nov 22 09:48:08 compute-0 systemd[95371]: Reached target Main User Target.
Nov 22 09:48:08 compute-0 systemd[95371]: Startup finished in 133ms.
Nov 22 09:48:08 compute-0 systemd[1]: Started User Manager for UID 0.
Nov 22 09:48:08 compute-0 systemd[1]: Started ovn_controller container.
Nov 22 09:48:08 compute-0 systemd[1]: Started Session c1 of User root.
Nov 22 09:48:08 compute-0 sudo[95270]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:08 compute-0 ovn_controller[95329]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 09:48:08 compute-0 ovn_controller[95329]: INFO:__main__:Validating config file
Nov 22 09:48:08 compute-0 ovn_controller[95329]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 09:48:08 compute-0 ovn_controller[95329]: INFO:__main__:Writing out command to execute
Nov 22 09:48:08 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 22 09:48:08 compute-0 ovn_controller[95329]: ++ cat /run_command
Nov 22 09:48:08 compute-0 ovn_controller[95329]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 22 09:48:08 compute-0 ovn_controller[95329]: + ARGS=
Nov 22 09:48:08 compute-0 ovn_controller[95329]: + sudo kolla_copy_cacerts
Nov 22 09:48:08 compute-0 systemd[1]: Started Session c2 of User root.
Nov 22 09:48:08 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 22 09:48:08 compute-0 ovn_controller[95329]: + [[ ! -n '' ]]
Nov 22 09:48:08 compute-0 ovn_controller[95329]: + . kolla_extend_start
Nov 22 09:48:08 compute-0 ovn_controller[95329]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 22 09:48:08 compute-0 ovn_controller[95329]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 22 09:48:08 compute-0 ovn_controller[95329]: + umask 0022
Nov 22 09:48:08 compute-0 ovn_controller[95329]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 22 09:48:08 compute-0 NetworkManager[55425]: <info>  [1763804888.8625] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 22 09:48:08 compute-0 NetworkManager[55425]: <info>  [1763804888.8635] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 09:48:08 compute-0 NetworkManager[55425]: <info>  [1763804888.8653] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 22 09:48:08 compute-0 NetworkManager[55425]: <info>  [1763804888.8661] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 22 09:48:08 compute-0 NetworkManager[55425]: <info>  [1763804888.8666] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 09:48:08 compute-0 kernel: br-int: entered promiscuous mode
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00011|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00012|features|INFO|OVS Feature: ct_flush, state: supported
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00013|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00014|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00015|main|INFO|OVS feature set changed, force recompute.
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00016|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00019|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00020|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 22 09:48:08 compute-0 systemd-udevd[95483]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 09:48:08 compute-0 ovn_controller[95329]: 2025-11-22T09:48:08Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 09:48:08 compute-0 NetworkManager[55425]: <info>  [1763804888.9086] manager: (ovn-346a8e-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 22 09:48:08 compute-0 systemd-udevd[95488]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 09:48:08 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Nov 22 09:48:08 compute-0 NetworkManager[55425]: <info>  [1763804888.9347] device (genev_sys_6081): carrier: link connected
Nov 22 09:48:08 compute-0 NetworkManager[55425]: <info>  [1763804888.9351] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Nov 22 09:48:09 compute-0 sudo[95590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbvhxkjajbkukylglsacesxkhdnujntt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804888.8921695-609-17233846822260/AnsiballZ_command.py'
Nov 22 09:48:09 compute-0 sudo[95590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:09 compute-0 python3.9[95592]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:48:09 compute-0 ovs-vsctl[95593]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 22 09:48:09 compute-0 sudo[95590]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:09 compute-0 sudo[95743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-supbdunqperhwdksmahrgivxbgnoyrkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804889.6562636-617-58399296819068/AnsiballZ_command.py'
Nov 22 09:48:09 compute-0 sudo[95743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:10 compute-0 python3.9[95745]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:48:10 compute-0 ovs-vsctl[95747]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 22 09:48:10 compute-0 sudo[95743]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:10 compute-0 sudo[95898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqgmmcocklosgtkngnopbtrwhjpwdarq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804890.6257372-631-65738822304745/AnsiballZ_command.py'
Nov 22 09:48:10 compute-0 sudo[95898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:11 compute-0 python3.9[95900]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:48:11 compute-0 ovs-vsctl[95901]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 22 09:48:11 compute-0 sudo[95898]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:11 compute-0 sshd-session[84836]: Connection closed by 192.168.122.30 port 51850
Nov 22 09:48:11 compute-0 sshd-session[84833]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:48:11 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Nov 22 09:48:11 compute-0 systemd[1]: session-19.scope: Consumed 49.081s CPU time.
Nov 22 09:48:11 compute-0 systemd-logind[819]: Session 19 logged out. Waiting for processes to exit.
Nov 22 09:48:11 compute-0 systemd-logind[819]: Removed session 19.
Nov 22 09:48:16 compute-0 sshd-session[95926]: Accepted publickey for zuul from 192.168.122.30 port 40088 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:48:16 compute-0 systemd-logind[819]: New session 21 of user zuul.
Nov 22 09:48:16 compute-0 systemd[1]: Started Session 21 of User zuul.
Nov 22 09:48:17 compute-0 sshd-session[95926]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:48:18 compute-0 python3.9[96079]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:48:19 compute-0 systemd[1]: Stopping User Manager for UID 0...
Nov 22 09:48:19 compute-0 systemd[95371]: Activating special unit Exit the Session...
Nov 22 09:48:19 compute-0 systemd[95371]: Stopped target Main User Target.
Nov 22 09:48:19 compute-0 systemd[95371]: Stopped target Basic System.
Nov 22 09:48:19 compute-0 systemd[95371]: Stopped target Paths.
Nov 22 09:48:19 compute-0 systemd[95371]: Stopped target Sockets.
Nov 22 09:48:19 compute-0 systemd[95371]: Stopped target Timers.
Nov 22 09:48:19 compute-0 systemd[95371]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 09:48:19 compute-0 systemd[95371]: Closed D-Bus User Message Bus Socket.
Nov 22 09:48:19 compute-0 systemd[95371]: Stopped Create User's Volatile Files and Directories.
Nov 22 09:48:19 compute-0 systemd[95371]: Removed slice User Application Slice.
Nov 22 09:48:19 compute-0 systemd[95371]: Reached target Shutdown.
Nov 22 09:48:19 compute-0 systemd[95371]: Finished Exit the Session.
Nov 22 09:48:19 compute-0 systemd[95371]: Reached target Exit the Session.
Nov 22 09:48:19 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Nov 22 09:48:19 compute-0 systemd[1]: Stopped User Manager for UID 0.
Nov 22 09:48:19 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 22 09:48:19 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 22 09:48:19 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 22 09:48:19 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 22 09:48:19 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Nov 22 09:48:19 compute-0 sudo[96236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iitngpuohrvjprdbjnzaboyztyxazakm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804898.6183133-34-135270910961741/AnsiballZ_file.py'
Nov 22 09:48:19 compute-0 sudo[96236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:19 compute-0 python3.9[96239]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:19 compute-0 sudo[96236]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:19 compute-0 sudo[96389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kctxxihddztqbqqsmmrehbevqzfykurr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804899.4527192-34-58214528140314/AnsiballZ_file.py'
Nov 22 09:48:19 compute-0 sudo[96389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:19 compute-0 python3.9[96391]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:19 compute-0 sudo[96389]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:20 compute-0 sudo[96541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnttjlrckgloeekxmvynuqzyrchghrpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804900.0803032-34-126346448158057/AnsiballZ_file.py'
Nov 22 09:48:20 compute-0 sudo[96541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:20 compute-0 python3.9[96543]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:20 compute-0 sudo[96541]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:21 compute-0 sudo[96693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdaoberygkzisgfmwhtvaqtozknvmsal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804900.7721095-34-151016665479773/AnsiballZ_file.py'
Nov 22 09:48:21 compute-0 sudo[96693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:21 compute-0 python3.9[96695]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:21 compute-0 sudo[96693]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:21 compute-0 sudo[96845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usxdavpqyrkjzijqdwkgvekuuvliwfeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804901.503413-34-209688725640791/AnsiballZ_file.py'
Nov 22 09:48:21 compute-0 sudo[96845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:22 compute-0 python3.9[96847]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:22 compute-0 sudo[96845]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:23 compute-0 python3.9[96997]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:48:23 compute-0 sudo[97147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxuhqcktqrvuzzlfyqvmnlfbddjybnrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804903.3982399-78-13135182997697/AnsiballZ_seboolean.py'
Nov 22 09:48:23 compute-0 sudo[97147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:24 compute-0 python3.9[97149]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 22 09:48:24 compute-0 sudo[97147]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:25 compute-0 python3.9[97299]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:26 compute-0 python3.9[97420]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804904.9576604-86-94399673026928/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:27 compute-0 python3.9[97571]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:27 compute-0 python3.9[97692]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804906.7257109-101-236510758059635/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:28 compute-0 sudo[97842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxboeoswtvbwokdsezrbojjmpvzrfcyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804908.4214215-118-172986259756694/AnsiballZ_setup.py'
Nov 22 09:48:28 compute-0 sudo[97842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:29 compute-0 python3.9[97844]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:48:29 compute-0 sudo[97842]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:29 compute-0 sudo[97926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amqxnjyohrejepifhfilrynxeisrqizw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804908.4214215-118-172986259756694/AnsiballZ_dnf.py'
Nov 22 09:48:29 compute-0 sudo[97926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:30 compute-0 python3.9[97928]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:48:31 compute-0 sudo[97926]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:32 compute-0 sudo[98079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lekkzrqiicylncjhwcqpwbvywrexqecm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804911.6264656-130-116454149213385/AnsiballZ_systemd.py'
Nov 22 09:48:32 compute-0 sudo[98079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:32 compute-0 python3.9[98081]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 09:48:32 compute-0 sudo[98079]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:33 compute-0 python3.9[98234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:34 compute-0 python3.9[98355]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804912.8434649-138-168435337353172/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:34 compute-0 python3.9[98505]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:35 compute-0 python3.9[98626]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804914.268565-138-199277259148094/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:36 compute-0 python3.9[98776]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:37 compute-0 python3.9[98897]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804916.1075003-182-250569540736742/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:37 compute-0 python3.9[99047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:38 compute-0 python3.9[99168]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804917.3157878-182-180856491150359/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:38 compute-0 ovn_controller[95329]: 2025-11-22T09:48:38Z|00025|memory|INFO|16256 kB peak resident set size after 29.8 seconds
Nov 22 09:48:38 compute-0 ovn_controller[95329]: 2025-11-22T09:48:38Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Nov 22 09:48:38 compute-0 podman[99169]: 2025-11-22 09:48:38.667379935 +0000 UTC m=+0.109774047 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:48:39 compute-0 python3.9[99344]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:48:39 compute-0 sudo[99496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwwgmocymlxermcculhfmodnahtvnlny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804919.50246-220-203417256008379/AnsiballZ_file.py'
Nov 22 09:48:39 compute-0 sudo[99496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:40 compute-0 python3.9[99498]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:40 compute-0 sudo[99496]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:40 compute-0 sudo[99648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxzpxjcoqyajwfcocjzctpkxxyajdjjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804920.33271-228-67423197313350/AnsiballZ_stat.py'
Nov 22 09:48:40 compute-0 sudo[99648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:40 compute-0 python3.9[99650]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:40 compute-0 sudo[99648]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:41 compute-0 sudo[99726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwslrxqvjjxmopokewjaryuradtuzeir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804920.33271-228-67423197313350/AnsiballZ_file.py'
Nov 22 09:48:41 compute-0 sudo[99726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:41 compute-0 python3.9[99728]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:41 compute-0 sudo[99726]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:42 compute-0 sudo[99878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkdjozyvkfslgxhqsvpxcinmsgekybrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804921.701716-228-156057578524614/AnsiballZ_stat.py'
Nov 22 09:48:42 compute-0 sudo[99878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:42 compute-0 python3.9[99880]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:42 compute-0 sudo[99878]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:42 compute-0 sudo[99956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivxuvcoyykljqzezwetebhcgmpructqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804921.701716-228-156057578524614/AnsiballZ_file.py'
Nov 22 09:48:42 compute-0 sudo[99956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:42 compute-0 python3.9[99958]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:48:42 compute-0 sudo[99956]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:43 compute-0 sudo[100108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkyispsvlibpnuhlypaswfgsuktyrrhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804922.8537812-251-251361970665397/AnsiballZ_file.py'
Nov 22 09:48:43 compute-0 sudo[100108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:43 compute-0 python3.9[100110]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:48:43 compute-0 sudo[100108]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:43 compute-0 sudo[100260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaaztcrxkgdbgnpwaovwmjlascvzsnbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804923.5491636-259-129175528238896/AnsiballZ_stat.py'
Nov 22 09:48:43 compute-0 sudo[100260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:44 compute-0 python3.9[100262]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:44 compute-0 sudo[100260]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:44 compute-0 sudo[100338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmdkpjzujtxtqmvyzpjzpufdeumzdhcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804923.5491636-259-129175528238896/AnsiballZ_file.py'
Nov 22 09:48:44 compute-0 sudo[100338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:44 compute-0 python3.9[100340]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:48:44 compute-0 sudo[100338]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:44 compute-0 sudo[100490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdpyfowqlqfjlusywfhhouxteubendyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804924.676286-271-27444091199765/AnsiballZ_stat.py'
Nov 22 09:48:44 compute-0 sudo[100490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:45 compute-0 python3.9[100492]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:45 compute-0 sudo[100490]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:45 compute-0 sudo[100568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gunefzdpodjlpqsamozwhzamfkocoadi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804924.676286-271-27444091199765/AnsiballZ_file.py'
Nov 22 09:48:45 compute-0 sudo[100568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:45 compute-0 python3.9[100570]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:48:45 compute-0 sudo[100568]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:46 compute-0 sudo[100720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxpdxfroygetvghpcukdomcebvrpfozh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804925.9283133-283-274953183581226/AnsiballZ_systemd.py'
Nov 22 09:48:46 compute-0 sudo[100720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:46 compute-0 python3.9[100722]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:48:46 compute-0 systemd[1]: Reloading.
Nov 22 09:48:46 compute-0 systemd-rc-local-generator[100748]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:48:46 compute-0 systemd-sysv-generator[100751]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:48:46 compute-0 sudo[100720]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:47 compute-0 sudo[100909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxkisjbyjpeiocgrryesmrwqojejpzug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804927.057096-291-225826132825429/AnsiballZ_stat.py'
Nov 22 09:48:47 compute-0 sudo[100909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:47 compute-0 python3.9[100911]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:47 compute-0 sudo[100909]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:47 compute-0 sudo[100987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iakxztubxdcooqkcrhjbdsgywgshfrjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804927.057096-291-225826132825429/AnsiballZ_file.py'
Nov 22 09:48:47 compute-0 sudo[100987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:57 compute-0 python3.9[100989]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:48:57 compute-0 sudo[100987]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:58 compute-0 sudo[101139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmirplqhpaidvquoaxgxhawbcjefqzzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804938.1031437-303-85363183305973/AnsiballZ_stat.py'
Nov 22 09:48:58 compute-0 sudo[101139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:58 compute-0 python3.9[101141]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:48:58 compute-0 sudo[101139]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:58 compute-0 sudo[101217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyomxbucggsflhaadpmumyoyyysvnrzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804938.1031437-303-85363183305973/AnsiballZ_file.py'
Nov 22 09:48:58 compute-0 sudo[101217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:59 compute-0 python3.9[101219]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:48:59 compute-0 sudo[101217]: pam_unix(sudo:session): session closed for user root
Nov 22 09:48:59 compute-0 sudo[101369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfqrecmxmlhyywboomgzpdjumywwlvba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804939.3434763-315-87222972999257/AnsiballZ_systemd.py'
Nov 22 09:48:59 compute-0 sudo[101369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:48:59 compute-0 python3.9[101371]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:49:00 compute-0 systemd[1]: Reloading.
Nov 22 09:49:00 compute-0 systemd-sysv-generator[101403]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:49:00 compute-0 systemd-rc-local-generator[101398]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:49:00 compute-0 systemd[1]: Starting Create netns directory...
Nov 22 09:49:00 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 09:49:00 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 09:49:00 compute-0 systemd[1]: Finished Create netns directory.
Nov 22 09:49:00 compute-0 sudo[101369]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:01 compute-0 sudo[101564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eltvloxrgtuvyjwbgpkfdozkfpmpjzhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804940.7301757-325-205587552147092/AnsiballZ_file.py'
Nov 22 09:49:01 compute-0 sudo[101564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:01 compute-0 python3.9[101566]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:49:01 compute-0 sudo[101564]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:01 compute-0 sudo[101716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chauarzcjoaehmltpadllatetmlsqrmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804941.493766-333-212936128954156/AnsiballZ_stat.py'
Nov 22 09:49:01 compute-0 sudo[101716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:02 compute-0 python3.9[101718]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:49:02 compute-0 sudo[101716]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:02 compute-0 sudo[101839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktncewzbdxzabmbrbblfcomylbmqxvtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804941.493766-333-212936128954156/AnsiballZ_copy.py'
Nov 22 09:49:02 compute-0 sudo[101839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:02 compute-0 python3.9[101841]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763804941.493766-333-212936128954156/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:49:02 compute-0 sudo[101839]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:03 compute-0 sudo[101991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juklfetmwophzbweivaqrkyqdwxjtmvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804943.1774356-350-281037620988023/AnsiballZ_file.py'
Nov 22 09:49:03 compute-0 sudo[101991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:03 compute-0 python3.9[101993]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:49:03 compute-0 sudo[101991]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:04 compute-0 sudo[102143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipofflvcijtqsjyysdoyueoopyufsejr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804943.9561534-358-65675714655262/AnsiballZ_stat.py'
Nov 22 09:49:04 compute-0 sudo[102143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:04 compute-0 python3.9[102145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:49:04 compute-0 sudo[102143]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:04 compute-0 sudo[102266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxiihwnfpmlfvjzldmxdpeoiccnnmkuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804943.9561534-358-65675714655262/AnsiballZ_copy.py'
Nov 22 09:49:04 compute-0 sudo[102266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:04 compute-0 python3.9[102268]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763804943.9561534-358-65675714655262/.source.json _original_basename=.lbn79fve follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:05 compute-0 sudo[102266]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:05 compute-0 sudo[102418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrlbhyhjoaqwobhspqkamtryiktjxlmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804945.1746385-373-91264495013435/AnsiballZ_file.py'
Nov 22 09:49:05 compute-0 sudo[102418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:05 compute-0 python3.9[102420]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:05 compute-0 sudo[102418]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:06 compute-0 sudo[102570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abqldntuldrujnixwbldnvfflcrbneuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804945.8959856-381-87150907432150/AnsiballZ_stat.py'
Nov 22 09:49:06 compute-0 sudo[102570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:06 compute-0 sudo[102570]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:06 compute-0 sudo[102693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpqcetiwcksfzvsjrfurrxcmrxkgmgny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804945.8959856-381-87150907432150/AnsiballZ_copy.py'
Nov 22 09:49:06 compute-0 sudo[102693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:06 compute-0 sudo[102693]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:07 compute-0 sudo[102845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egdvqqdgxajvxgftqplsbecpskievemp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804947.2688737-398-84039815458985/AnsiballZ_container_config_data.py'
Nov 22 09:49:07 compute-0 sudo[102845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:07 compute-0 python3.9[102847]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 22 09:49:07 compute-0 sudo[102845]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:08 compute-0 sudo[102997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqyalexqymuoymetlivhnklvkpzyiqqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804948.1610663-407-41415445795116/AnsiballZ_container_config_hash.py'
Nov 22 09:49:08 compute-0 sudo[102997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:08 compute-0 python3.9[102999]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 09:49:08 compute-0 sudo[102997]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:09 compute-0 sudo[103163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbzgctobizvfwkilxinsbcxhnxvyobhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804949.1207376-416-227273360587936/AnsiballZ_podman_container_info.py'
Nov 22 09:49:09 compute-0 sudo[103163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:09 compute-0 podman[103123]: 2025-11-22 09:49:09.634598412 +0000 UTC m=+0.114185309 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller)
Nov 22 09:49:09 compute-0 python3.9[103171]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 09:49:09 compute-0 sudo[103163]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:10 compute-0 sudo[103355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhnavzmtlpjweiotrjpilrykdyzxnfrk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763804950.373755-429-184263601452010/AnsiballZ_edpm_container_manage.py'
Nov 22 09:49:10 compute-0 sudo[103355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:11 compute-0 python3[103357]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 09:49:11 compute-0 podman[103394]: 2025-11-22 09:49:11.464743779 +0000 UTC m=+0.053137052 container create 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 09:49:11 compute-0 podman[103394]: 2025-11-22 09:49:11.436002765 +0000 UTC m=+0.024396028 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 09:49:11 compute-0 python3[103357]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 09:49:11 compute-0 sudo[103355]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:12 compute-0 sudo[103583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isvdnonztekusvvvojcpvpsnxayobfqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804951.7986271-437-100046283096953/AnsiballZ_stat.py'
Nov 22 09:49:12 compute-0 sudo[103583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:12 compute-0 python3.9[103585]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:49:12 compute-0 sudo[103583]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:12 compute-0 sudo[103737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucdoxcwbzqptjkndjqmilnmtkpmpuzth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804952.5554178-446-8576388447555/AnsiballZ_file.py'
Nov 22 09:49:12 compute-0 sudo[103737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:13 compute-0 python3.9[103739]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:13 compute-0 sudo[103737]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:13 compute-0 sudo[103813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjtztneiwjwoaqahyfclmcemzeypzuje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804952.5554178-446-8576388447555/AnsiballZ_stat.py'
Nov 22 09:49:13 compute-0 sudo[103813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:13 compute-0 python3.9[103815]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:49:13 compute-0 sudo[103813]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:14 compute-0 sudo[103964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjdkpehqchbvzhoyasifgfnhcymppnmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804953.5849187-446-36656384526805/AnsiballZ_copy.py'
Nov 22 09:49:14 compute-0 sudo[103964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:14 compute-0 python3.9[103966]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763804953.5849187-446-36656384526805/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:14 compute-0 sudo[103964]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:14 compute-0 sudo[104040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jytdjrlhtquxntwzxnsgrgpilvqbuqxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804953.5849187-446-36656384526805/AnsiballZ_systemd.py'
Nov 22 09:49:14 compute-0 sudo[104040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:14 compute-0 python3.9[104042]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:49:14 compute-0 systemd[1]: Reloading.
Nov 22 09:49:14 compute-0 systemd-rc-local-generator[104068]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:49:14 compute-0 systemd-sysv-generator[104073]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:49:15 compute-0 sudo[104040]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:15 compute-0 sudo[104151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elldosmuptupktwsgdzodorepmrafzqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804953.5849187-446-36656384526805/AnsiballZ_systemd.py'
Nov 22 09:49:15 compute-0 sudo[104151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:15 compute-0 python3.9[104153]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:49:15 compute-0 systemd[1]: Reloading.
Nov 22 09:49:15 compute-0 systemd-rc-local-generator[104185]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:49:15 compute-0 systemd-sysv-generator[104190]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:49:15 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Nov 22 09:49:15 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:49:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4a5d0bcc2cd704014f0ee10e3cf91d81cea47b4f2ccccaa1899f55ef2ad9224/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 22 09:49:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4a5d0bcc2cd704014f0ee10e3cf91d81cea47b4f2ccccaa1899f55ef2ad9224/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 09:49:16 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f.
Nov 22 09:49:16 compute-0 podman[104195]: 2025-11-22 09:49:16.032761933 +0000 UTC m=+0.122666341 container init 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: + sudo -E kolla_set_configs
Nov 22 09:49:16 compute-0 podman[104195]: 2025-11-22 09:49:16.065684892 +0000 UTC m=+0.155589270 container start 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:49:16 compute-0 edpm-start-podman-container[104195]: ovn_metadata_agent
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Validating config file
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Copying service configuration files
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Writing out command to execute
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 22 09:49:16 compute-0 podman[104217]: 2025-11-22 09:49:16.142995123 +0000 UTC m=+0.067992848 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:49:16 compute-0 edpm-start-podman-container[104194]: Creating additional drop-in dependency for "ovn_metadata_agent" (6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f)
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: ++ cat /run_command
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: + CMD=neutron-ovn-metadata-agent
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: + ARGS=
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: + sudo kolla_copy_cacerts
Nov 22 09:49:16 compute-0 systemd[1]: Reloading.
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: + [[ ! -n '' ]]
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: + . kolla_extend_start
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: Running command: 'neutron-ovn-metadata-agent'
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: + umask 0022
Nov 22 09:49:16 compute-0 ovn_metadata_agent[104211]: + exec neutron-ovn-metadata-agent
Nov 22 09:49:16 compute-0 systemd-sysv-generator[104294]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:49:16 compute-0 systemd-rc-local-generator[104290]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:49:16 compute-0 systemd[1]: Started ovn_metadata_agent container.
Nov 22 09:49:16 compute-0 sudo[104151]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:16 compute-0 sshd-session[95929]: Connection closed by 192.168.122.30 port 40088
Nov 22 09:49:16 compute-0 sshd-session[95926]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:49:16 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Nov 22 09:49:16 compute-0 systemd[1]: session-21.scope: Consumed 35.998s CPU time.
Nov 22 09:49:16 compute-0 systemd-logind[819]: Session 21 logged out. Waiting for processes to exit.
Nov 22 09:49:16 compute-0 systemd-logind[819]: Removed session 21.
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.878 104216 INFO neutron.common.config [-] Logging enabled!
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.879 104216 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.879 104216 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.879 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.879 104216 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.879 104216 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.880 104216 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.880 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.880 104216 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.880 104216 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.880 104216 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.880 104216 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.880 104216 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.880 104216 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.880 104216 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.880 104216 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.880 104216 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.881 104216 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.881 104216 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.881 104216 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.881 104216 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.881 104216 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.881 104216 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.881 104216 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.881 104216 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.881 104216 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.881 104216 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.882 104216 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.882 104216 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.882 104216 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.882 104216 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.882 104216 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.882 104216 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.882 104216 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.882 104216 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.882 104216 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.882 104216 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.883 104216 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.883 104216 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.883 104216 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.883 104216 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.883 104216 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.883 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.883 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.883 104216 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.883 104216 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.883 104216 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.884 104216 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.884 104216 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.884 104216 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.884 104216 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.884 104216 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.884 104216 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.884 104216 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.884 104216 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.884 104216 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.884 104216 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.885 104216 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.885 104216 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.885 104216 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.885 104216 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.885 104216 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.885 104216 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.885 104216 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.885 104216 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.885 104216 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.885 104216 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.886 104216 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.886 104216 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.886 104216 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.886 104216 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.886 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.886 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.886 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.886 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.886 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.886 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.887 104216 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.887 104216 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.887 104216 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.887 104216 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.887 104216 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.887 104216 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.887 104216 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.887 104216 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.887 104216 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.887 104216 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.888 104216 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.888 104216 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.888 104216 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.888 104216 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.888 104216 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.888 104216 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.888 104216 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.888 104216 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.888 104216 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.888 104216 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.888 104216 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.889 104216 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.889 104216 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.889 104216 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.889 104216 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.889 104216 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.889 104216 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.889 104216 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.889 104216 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.889 104216 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.889 104216 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.889 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.890 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.890 104216 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.890 104216 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.890 104216 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.890 104216 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.890 104216 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.890 104216 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.890 104216 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.890 104216 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.891 104216 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.891 104216 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.891 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.891 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.891 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.891 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.891 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.891 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.891 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.891 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.892 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.892 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.892 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.892 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.892 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.892 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.892 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.892 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.892 104216 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.892 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.893 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.893 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.893 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.893 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.893 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.893 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.893 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.893 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.893 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.894 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.894 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.894 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.894 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.894 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.894 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.894 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.894 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.894 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.894 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.895 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.895 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.895 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.895 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.895 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.895 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.895 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.895 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.895 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.895 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.896 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.896 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.896 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.896 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.896 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.896 104216 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.896 104216 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.896 104216 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.896 104216 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.896 104216 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.897 104216 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.897 104216 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.897 104216 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.897 104216 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.897 104216 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.897 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.897 104216 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.897 104216 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.897 104216 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.897 104216 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.898 104216 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.898 104216 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.898 104216 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.898 104216 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.898 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.898 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.898 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.898 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.898 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.898 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.899 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.899 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.899 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.899 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.899 104216 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.899 104216 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.899 104216 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.899 104216 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.899 104216 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.899 104216 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.900 104216 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.900 104216 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.900 104216 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.900 104216 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.900 104216 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.900 104216 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.900 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.900 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.900 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.901 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.901 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.901 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.901 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.901 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.901 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.901 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.901 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.901 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.901 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.901 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.902 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.902 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.902 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.902 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.902 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.902 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.902 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.902 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.902 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.902 104216 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.903 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.903 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.903 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.903 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.903 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.903 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.903 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.903 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.903 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.904 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.904 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.904 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.904 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.904 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.904 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.904 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.904 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.904 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.904 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.905 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.905 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.905 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.905 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.905 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.905 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.905 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.905 104216 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.905 104216 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.905 104216 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.906 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.906 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.906 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.906 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.906 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.906 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.906 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.906 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.906 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.907 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.907 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.907 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.907 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.907 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.907 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.907 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.907 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.907 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.907 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.908 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.908 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.908 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.908 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.908 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.908 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.908 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.908 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.908 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.908 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.909 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.909 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.909 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.909 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.909 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.909 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.909 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.909 104216 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.909 104216 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.918 104216 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.918 104216 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.918 104216 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.918 104216 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.918 104216 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.930 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name f6533837-2723-4772-a9db-3c9eeea0db5c (UUID: f6533837-2723-4772-a9db-3c9eeea0db5c) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.960 104216 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.960 104216 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.960 104216 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.960 104216 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.963 104216 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.970 104216 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.977 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'f6533837-2723-4772-a9db-3c9eeea0db5c'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], external_ids={}, name=f6533837-2723-4772-a9db-3c9eeea0db5c, nb_cfg_timestamp=1763804896888, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.978 104216 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f6649217dc0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.978 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.978 104216 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.979 104216 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.979 104216 INFO oslo_service.service [-] Starting 1 workers
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.983 104216 DEBUG oslo_service.service [-] Started child 104324 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.985 104324 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-428020'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 22 09:49:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:17.986 104216 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpf8c21qeb/privsep.sock']
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.008 104324 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.009 104324 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.009 104324 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.012 104324 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.019 104324 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.027 104324 INFO eventlet.wsgi.server [-] (104324) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 22 09:49:18 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.695 104216 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.696 104216 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpf8c21qeb/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.562 104329 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.570 104329 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.574 104329 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.575 104329 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104329
Nov 22 09:49:18 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:18.701 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[74f6fcba-7b63-4601-86e3-5a2d53b9bcc9]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.176 104329 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.176 104329 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.176 104329 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.693 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[91425975-1bd8-455c-a75b-0c70ca00dbaf]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.696 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, column=external_ids, values=({'neutron:ovn-metadata-id': 'b82e4d1d-d9d3-5217-8028-d3787207409a'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.710 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.719 104216 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.719 104216 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.719 104216 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.719 104216 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.719 104216 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.720 104216 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.720 104216 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.720 104216 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.721 104216 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.721 104216 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.721 104216 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.722 104216 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.722 104216 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.722 104216 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.723 104216 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.723 104216 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.724 104216 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.724 104216 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.724 104216 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.725 104216 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.725 104216 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.725 104216 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.726 104216 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.726 104216 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.727 104216 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.727 104216 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.727 104216 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.728 104216 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.728 104216 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.728 104216 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.729 104216 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.729 104216 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.729 104216 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.730 104216 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.730 104216 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.730 104216 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.731 104216 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.731 104216 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.732 104216 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.732 104216 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.732 104216 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.733 104216 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.733 104216 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.733 104216 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.734 104216 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.734 104216 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.734 104216 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.735 104216 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.735 104216 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.735 104216 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.736 104216 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.736 104216 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.736 104216 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.736 104216 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.737 104216 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.737 104216 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.737 104216 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.738 104216 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.738 104216 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.738 104216 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.739 104216 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.739 104216 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.739 104216 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.740 104216 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.740 104216 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.741 104216 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.741 104216 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.742 104216 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.742 104216 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.742 104216 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.743 104216 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.743 104216 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.744 104216 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.744 104216 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.744 104216 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.745 104216 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.745 104216 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.745 104216 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.746 104216 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.746 104216 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.746 104216 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.747 104216 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.747 104216 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.747 104216 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.748 104216 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.748 104216 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.748 104216 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.749 104216 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.749 104216 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.749 104216 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.750 104216 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.750 104216 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.751 104216 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.751 104216 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.752 104216 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.753 104216 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.753 104216 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.753 104216 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.753 104216 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.753 104216 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.754 104216 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.754 104216 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.754 104216 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.754 104216 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.754 104216 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.754 104216 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.754 104216 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.755 104216 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.755 104216 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.755 104216 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.755 104216 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.755 104216 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.756 104216 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.756 104216 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.756 104216 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.756 104216 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.756 104216 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.756 104216 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.757 104216 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.757 104216 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.757 104216 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.757 104216 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.757 104216 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.757 104216 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.758 104216 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.758 104216 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.758 104216 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.758 104216 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.758 104216 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.758 104216 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.759 104216 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.759 104216 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.759 104216 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.759 104216 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.759 104216 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.759 104216 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.759 104216 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.760 104216 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.760 104216 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.760 104216 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.760 104216 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.760 104216 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.760 104216 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.761 104216 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.761 104216 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.761 104216 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.761 104216 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.761 104216 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.761 104216 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.761 104216 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.761 104216 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.762 104216 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.762 104216 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.762 104216 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.762 104216 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.762 104216 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.762 104216 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.762 104216 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.763 104216 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.763 104216 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.763 104216 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.763 104216 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.763 104216 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.763 104216 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.763 104216 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.764 104216 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.764 104216 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.764 104216 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.764 104216 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.764 104216 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.764 104216 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.764 104216 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.765 104216 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.765 104216 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.765 104216 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.765 104216 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.765 104216 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.766 104216 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.766 104216 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.766 104216 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.766 104216 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.766 104216 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.766 104216 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.767 104216 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.767 104216 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.767 104216 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.767 104216 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.768 104216 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.768 104216 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.768 104216 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.768 104216 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.768 104216 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.768 104216 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.768 104216 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.769 104216 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.769 104216 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.769 104216 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.769 104216 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.769 104216 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.769 104216 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.770 104216 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.770 104216 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.770 104216 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.770 104216 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.770 104216 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.770 104216 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.770 104216 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.771 104216 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.771 104216 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.771 104216 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.771 104216 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.771 104216 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.771 104216 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.771 104216 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.772 104216 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.772 104216 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.772 104216 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.772 104216 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.772 104216 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.772 104216 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.772 104216 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.773 104216 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.773 104216 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.773 104216 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.773 104216 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.773 104216 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.773 104216 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.773 104216 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.774 104216 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.774 104216 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.774 104216 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.774 104216 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.774 104216 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.774 104216 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.774 104216 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.775 104216 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.775 104216 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.775 104216 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.775 104216 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.775 104216 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.775 104216 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.776 104216 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.776 104216 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.776 104216 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.776 104216 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.776 104216 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.776 104216 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.777 104216 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.777 104216 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.777 104216 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.777 104216 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.777 104216 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.777 104216 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.777 104216 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.778 104216 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.778 104216 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.778 104216 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.778 104216 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.778 104216 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.778 104216 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.779 104216 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.779 104216 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.779 104216 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.779 104216 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.779 104216 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.779 104216 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.780 104216 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.780 104216 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.780 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.780 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.780 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.780 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.780 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.781 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.781 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.781 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.781 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.781 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.782 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.782 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.782 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.782 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.782 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.782 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.783 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.783 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.783 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.783 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.783 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.784 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.784 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.784 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.784 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.784 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.784 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.785 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.785 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.785 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.785 104216 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.785 104216 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.785 104216 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.786 104216 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.786 104216 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:49:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:49:19.786 104216 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 09:49:22 compute-0 sshd-session[104334]: Accepted publickey for zuul from 192.168.122.30 port 35646 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:49:22 compute-0 systemd-logind[819]: New session 22 of user zuul.
Nov 22 09:49:22 compute-0 systemd[1]: Started Session 22 of User zuul.
Nov 22 09:49:22 compute-0 sshd-session[104334]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:49:23 compute-0 python3.9[104487]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:49:24 compute-0 sudo[104642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irwndflnqyotpahkckvlyjysofjkipmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804964.2908587-34-197241741603847/AnsiballZ_command.py'
Nov 22 09:49:24 compute-0 sudo[104642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:24 compute-0 python3.9[104644]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:49:24 compute-0 sudo[104642]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:25 compute-0 sudo[104807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtmxmneldjqeikmxpkppwklefyegzpdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804965.319461-45-14182867654601/AnsiballZ_systemd_service.py'
Nov 22 09:49:25 compute-0 sudo[104807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:26 compute-0 python3.9[104809]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:49:26 compute-0 systemd[1]: Reloading.
Nov 22 09:49:26 compute-0 systemd-sysv-generator[104840]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:49:26 compute-0 systemd-rc-local-generator[104836]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:49:26 compute-0 sudo[104807]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:27 compute-0 python3.9[104994]: ansible-ansible.builtin.service_facts Invoked
Nov 22 09:49:27 compute-0 network[105011]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 09:49:27 compute-0 network[105012]: 'network-scripts' will be removed from distribution in near future.
Nov 22 09:49:27 compute-0 network[105013]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 09:49:32 compute-0 sudo[105272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reezfnolxesqblkvbhjeuaqoawslhzvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804971.7516894-64-56571062979164/AnsiballZ_systemd_service.py'
Nov 22 09:49:32 compute-0 sudo[105272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:32 compute-0 python3.9[105274]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:49:32 compute-0 sudo[105272]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:33 compute-0 sudo[105425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxeatmlcrkfnxzjctzzjjzrwqgllfrwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804972.6636946-64-75043226300184/AnsiballZ_systemd_service.py'
Nov 22 09:49:33 compute-0 sudo[105425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:33 compute-0 python3.9[105427]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:49:33 compute-0 sudo[105425]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:34 compute-0 sudo[105578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvdtfdrdmrqvrgjfcxtcoknxrdyuoecy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804973.6154828-64-87791571606846/AnsiballZ_systemd_service.py'
Nov 22 09:49:34 compute-0 sudo[105578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:34 compute-0 python3.9[105580]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:49:34 compute-0 sudo[105578]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:34 compute-0 sudo[105731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxovggpylygjhjnmnbndxbuhmjhqzbgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804974.5611722-64-148172034690673/AnsiballZ_systemd_service.py'
Nov 22 09:49:34 compute-0 sudo[105731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:35 compute-0 python3.9[105733]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:49:35 compute-0 sudo[105731]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:36 compute-0 sudo[105884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuqlkzyurvhggmoyngeifydzszflmqio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804975.6257498-64-208960609411131/AnsiballZ_systemd_service.py'
Nov 22 09:49:36 compute-0 sudo[105884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:36 compute-0 python3.9[105886]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:49:36 compute-0 sudo[105884]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:37 compute-0 sudo[106037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wriuyedoicjigtfcjocikwijslxhmbug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804976.6537757-64-216984669806111/AnsiballZ_systemd_service.py'
Nov 22 09:49:37 compute-0 sudo[106037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:37 compute-0 python3.9[106039]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:49:37 compute-0 sudo[106037]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:37 compute-0 sudo[106190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gytcnyojsuixeiuyvwvejxxqzttxwags ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804977.583563-64-48847860269503/AnsiballZ_systemd_service.py'
Nov 22 09:49:37 compute-0 sudo[106190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:38 compute-0 python3.9[106192]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:49:38 compute-0 sudo[106190]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:39 compute-0 sudo[106343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjoafqvqdikkfrzqmcgecwimzbummrnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804978.6910236-116-166439770200667/AnsiballZ_file.py'
Nov 22 09:49:39 compute-0 sudo[106343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:39 compute-0 python3.9[106345]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:39 compute-0 sudo[106343]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:40 compute-0 sudo[106506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-centekpzqzvotcguroyzagvsqskpzqfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804979.6549783-116-67568193619291/AnsiballZ_file.py'
Nov 22 09:49:40 compute-0 sudo[106506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:40 compute-0 podman[106469]: 2025-11-22 09:49:40.153485999 +0000 UTC m=+0.145854914 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:49:40 compute-0 python3.9[106514]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:40 compute-0 sudo[106506]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:40 compute-0 sudo[106674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owofurnzrchypzbevvivlgyzlfrtfqdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804980.4796-116-160977550507403/AnsiballZ_file.py'
Nov 22 09:49:40 compute-0 sudo[106674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:41 compute-0 python3.9[106676]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:41 compute-0 sudo[106674]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:41 compute-0 sudo[106826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoqokixazxuprxoxaijkueyhbvgidgfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804981.2159538-116-247005952498765/AnsiballZ_file.py'
Nov 22 09:49:41 compute-0 sudo[106826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:41 compute-0 python3.9[106828]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:41 compute-0 sudo[106826]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:42 compute-0 sudo[106978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llakrurytcuglnjyutvtpnzhgesdldyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804982.0297127-116-234670449749046/AnsiballZ_file.py'
Nov 22 09:49:42 compute-0 sudo[106978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:42 compute-0 python3.9[106980]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:42 compute-0 sudo[106978]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:43 compute-0 sudo[107130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bocydhingkgkemzancspjkitelfxkiqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804982.7373037-116-116653995369018/AnsiballZ_file.py'
Nov 22 09:49:43 compute-0 sudo[107130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:43 compute-0 python3.9[107132]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:43 compute-0 sudo[107130]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:43 compute-0 sudo[107282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtedxmradtyxkavvvdvidyuurssoajwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804983.4088075-116-239831243314289/AnsiballZ_file.py'
Nov 22 09:49:43 compute-0 sudo[107282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:43 compute-0 python3.9[107284]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:43 compute-0 sudo[107282]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:44 compute-0 sudo[107434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-harkdbxaavgctprqjezkbjlqygkswpin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804984.1205533-166-104104944056963/AnsiballZ_file.py'
Nov 22 09:49:44 compute-0 sudo[107434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:44 compute-0 python3.9[107436]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:44 compute-0 sudo[107434]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:45 compute-0 sudo[107586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apoykkhxfzwqxeuqmuopmnxwexrkcbud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804984.768704-166-261962279556991/AnsiballZ_file.py'
Nov 22 09:49:45 compute-0 sudo[107586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:45 compute-0 python3.9[107588]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:45 compute-0 sudo[107586]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:45 compute-0 sudo[107738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqvtyteczmxodnwtxeutgdjqjzmdckdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804985.4190874-166-51487694991499/AnsiballZ_file.py'
Nov 22 09:49:45 compute-0 sudo[107738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:45 compute-0 python3.9[107740]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:45 compute-0 sudo[107738]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:46 compute-0 sudo[107901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdendkljliurczpibqwivbufcvgqxoyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804986.1310494-166-253056992801187/AnsiballZ_file.py'
Nov 22 09:49:46 compute-0 sudo[107901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:46 compute-0 podman[107864]: 2025-11-22 09:49:46.501055098 +0000 UTC m=+0.074205798 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 09:49:46 compute-0 python3.9[107907]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:46 compute-0 sudo[107901]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:47 compute-0 sudo[108061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnpmrhdpwsekidsuovkcpsexhxkslqtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804986.840754-166-168676615214956/AnsiballZ_file.py'
Nov 22 09:49:47 compute-0 sudo[108061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:47 compute-0 python3.9[108063]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:47 compute-0 sudo[108061]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:47 compute-0 sudo[108213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gewfdwycfsklxmqeqoscdkspfbamwyjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804987.4399872-166-69244807638244/AnsiballZ_file.py'
Nov 22 09:49:47 compute-0 sudo[108213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:47 compute-0 python3.9[108215]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:47 compute-0 sudo[108213]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:48 compute-0 sudo[108365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpsielikhheudsiuxgdolapcxiclnyau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804988.0609272-166-193777825801995/AnsiballZ_file.py'
Nov 22 09:49:48 compute-0 sudo[108365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:48 compute-0 python3.9[108367]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:49:48 compute-0 sudo[108365]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:49 compute-0 sudo[108517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdsuimbdxvcqykaiermeiqlgcsyhpskf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804988.9165387-217-267932469541001/AnsiballZ_command.py'
Nov 22 09:49:49 compute-0 sudo[108517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:49 compute-0 python3.9[108519]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:49:49 compute-0 sudo[108517]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:50 compute-0 python3.9[108671]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 09:49:51 compute-0 sudo[108821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxfssifpesxyhqowqnevoxowqkiycdhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804990.881101-235-159970114778301/AnsiballZ_systemd_service.py'
Nov 22 09:49:51 compute-0 sudo[108821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:51 compute-0 python3.9[108823]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:49:51 compute-0 systemd[1]: Reloading.
Nov 22 09:49:51 compute-0 systemd-rc-local-generator[108851]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:49:51 compute-0 systemd-sysv-generator[108854]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:49:51 compute-0 sudo[108821]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:52 compute-0 sudo[109008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgbdgbxgjbdmuoboqzbxiyhdgdizqdal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804992.1106548-243-172913232732132/AnsiballZ_command.py'
Nov 22 09:49:52 compute-0 sudo[109008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:52 compute-0 python3.9[109010]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:49:52 compute-0 sudo[109008]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:53 compute-0 sudo[109161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgdbahgdpggbkbliyhiyhfftaqlpxymu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804992.9059556-243-143452457018275/AnsiballZ_command.py'
Nov 22 09:49:53 compute-0 sudo[109161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:53 compute-0 python3.9[109163]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:49:53 compute-0 sudo[109161]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:53 compute-0 sudo[109314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fptfyhtuxoqplxkxuihwokdcgimkmzuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804993.6239004-243-95228618528483/AnsiballZ_command.py'
Nov 22 09:49:53 compute-0 sudo[109314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:54 compute-0 python3.9[109316]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:49:54 compute-0 sudo[109314]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:54 compute-0 sudo[109467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkdeccjtcniuiabypxyphbnuwrwomxbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804994.5507746-243-139821471618602/AnsiballZ_command.py'
Nov 22 09:49:54 compute-0 sudo[109467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:55 compute-0 python3.9[109469]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:49:55 compute-0 sudo[109467]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:55 compute-0 sudo[109620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhrnvnglestzgupznzpfmivrpqnohzxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804995.3224318-243-68762477966106/AnsiballZ_command.py'
Nov 22 09:49:55 compute-0 sudo[109620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:55 compute-0 python3.9[109622]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:49:55 compute-0 sudo[109620]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:56 compute-0 sudo[109773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuyrybwaqpcsbgmisdxeysieiyaydgef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804996.0171533-243-177105791268689/AnsiballZ_command.py'
Nov 22 09:49:56 compute-0 sudo[109773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:56 compute-0 python3.9[109775]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:49:56 compute-0 sudo[109773]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:57 compute-0 sudo[109926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbdbfgobcotxkpwvijbvhlveivcassds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804996.7367284-243-53606485046158/AnsiballZ_command.py'
Nov 22 09:49:57 compute-0 sudo[109926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:57 compute-0 python3.9[109928]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:49:57 compute-0 sudo[109926]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:58 compute-0 sudo[110079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuljeeasyxlsmaarnreltadhkbftpvyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804997.8268733-297-203265078870425/AnsiballZ_getent.py'
Nov 22 09:49:58 compute-0 sudo[110079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:58 compute-0 python3.9[110081]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 22 09:49:58 compute-0 sudo[110079]: pam_unix(sudo:session): session closed for user root
Nov 22 09:49:59 compute-0 sudo[110232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrjbpbhawvmunupohiwvjgadnkoyiqho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804998.8375664-305-175205467690022/AnsiballZ_group.py'
Nov 22 09:49:59 compute-0 sudo[110232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:49:59 compute-0 python3.9[110234]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 09:49:59 compute-0 groupadd[110235]: group added to /etc/group: name=libvirt, GID=42473
Nov 22 09:49:59 compute-0 groupadd[110235]: group added to /etc/gshadow: name=libvirt
Nov 22 09:49:59 compute-0 groupadd[110235]: new group: name=libvirt, GID=42473
Nov 22 09:49:59 compute-0 sudo[110232]: pam_unix(sudo:session): session closed for user root
Nov 22 09:50:00 compute-0 sudo[110390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysdfulkfznyllulzqaokoahoyrsotros ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763804999.7421255-313-96747633690887/AnsiballZ_user.py'
Nov 22 09:50:00 compute-0 sudo[110390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:50:00 compute-0 python3.9[110392]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 09:50:00 compute-0 useradd[110394]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 22 09:50:00 compute-0 sudo[110390]: pam_unix(sudo:session): session closed for user root
Nov 22 09:50:01 compute-0 sudo[110550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lksudkybwnvzshlauwnceddexjcihwap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805001.0386233-324-279104204360647/AnsiballZ_setup.py'
Nov 22 09:50:01 compute-0 sudo[110550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:50:01 compute-0 python3.9[110552]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:50:02 compute-0 sudo[110550]: pam_unix(sudo:session): session closed for user root
Nov 22 09:50:02 compute-0 sudo[110634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amqyutmsthersyvkbgyqvmoulinxlbwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805001.0386233-324-279104204360647/AnsiballZ_dnf.py'
Nov 22 09:50:02 compute-0 sudo[110634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:50:02 compute-0 python3.9[110636]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:50:10 compute-0 podman[110667]: 2025-11-22 09:50:10.701358722 +0000 UTC m=+0.138662870 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 09:50:16 compute-0 podman[110847]: 2025-11-22 09:50:16.653531055 +0000 UTC m=+0.101700843 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 09:50:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:50:17.911 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:50:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:50:17.912 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:50:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:50:17.912 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:50:33 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Nov 22 09:50:33 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 09:50:33 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 09:50:33 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 09:50:33 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 09:50:33 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 09:50:33 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 09:50:33 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 09:50:41 compute-0 dbus-broker-launch[813]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 22 09:50:41 compute-0 podman[110879]: 2025-11-22 09:50:41.713697483 +0000 UTC m=+0.152002233 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 09:50:42 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Nov 22 09:50:42 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 09:50:42 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 09:50:42 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 09:50:42 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 09:50:42 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 09:50:42 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 09:50:42 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 09:50:47 compute-0 dbus-broker-launch[813]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 22 09:50:47 compute-0 podman[110913]: 2025-11-22 09:50:47.611227707 +0000 UTC m=+0.051781682 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 09:51:12 compute-0 podman[121578]: 2025-11-22 09:51:12.675694119 +0000 UTC m=+0.121464481 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:51:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:51:17.912 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:51:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:51:17.913 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:51:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:51:17.913 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:51:18 compute-0 podman[124624]: 2025-11-22 09:51:18.591651665 +0000 UTC m=+0.048881159 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 09:51:39 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Nov 22 09:51:39 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 09:51:39 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 09:51:39 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 09:51:39 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 09:51:39 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 09:51:39 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 09:51:39 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 09:51:41 compute-0 groupadd[127784]: group added to /etc/group: name=dnsmasq, GID=992
Nov 22 09:51:41 compute-0 groupadd[127784]: group added to /etc/gshadow: name=dnsmasq
Nov 22 09:51:41 compute-0 groupadd[127784]: new group: name=dnsmasq, GID=992
Nov 22 09:51:41 compute-0 useradd[127791]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 22 09:51:41 compute-0 dbus-broker-launch[812]: Noticed file-system modification, trigger reload.
Nov 22 09:51:41 compute-0 dbus-broker-launch[813]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 22 09:51:41 compute-0 dbus-broker-launch[812]: Noticed file-system modification, trigger reload.
Nov 22 09:51:42 compute-0 groupadd[127804]: group added to /etc/group: name=clevis, GID=991
Nov 22 09:51:42 compute-0 groupadd[127804]: group added to /etc/gshadow: name=clevis
Nov 22 09:51:42 compute-0 groupadd[127804]: new group: name=clevis, GID=991
Nov 22 09:51:42 compute-0 useradd[127811]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 22 09:51:42 compute-0 usermod[127821]: add 'clevis' to group 'tss'
Nov 22 09:51:42 compute-0 usermod[127821]: add 'clevis' to shadow group 'tss'
Nov 22 09:51:43 compute-0 podman[127831]: 2025-11-22 09:51:43.003760648 +0000 UTC m=+0.120267888 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 09:51:44 compute-0 polkitd[43509]: Reloading rules
Nov 22 09:51:44 compute-0 polkitd[43509]: Collecting garbage unconditionally...
Nov 22 09:51:44 compute-0 polkitd[43509]: Loading rules from directory /etc/polkit-1/rules.d
Nov 22 09:51:44 compute-0 polkitd[43509]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 22 09:51:44 compute-0 polkitd[43509]: Finished loading, compiling and executing 3 rules
Nov 22 09:51:44 compute-0 polkitd[43509]: Reloading rules
Nov 22 09:51:44 compute-0 polkitd[43509]: Collecting garbage unconditionally...
Nov 22 09:51:44 compute-0 polkitd[43509]: Loading rules from directory /etc/polkit-1/rules.d
Nov 22 09:51:44 compute-0 polkitd[43509]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 22 09:51:44 compute-0 polkitd[43509]: Finished loading, compiling and executing 3 rules
Nov 22 09:51:45 compute-0 groupadd[128035]: group added to /etc/group: name=ceph, GID=167
Nov 22 09:51:45 compute-0 groupadd[128035]: group added to /etc/gshadow: name=ceph
Nov 22 09:51:45 compute-0 groupadd[128035]: new group: name=ceph, GID=167
Nov 22 09:51:46 compute-0 useradd[128041]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 22 09:51:48 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Nov 22 09:51:48 compute-0 sshd[1005]: Received signal 15; terminating.
Nov 22 09:51:48 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Nov 22 09:51:48 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Nov 22 09:51:48 compute-0 systemd[1]: sshd.service: Consumed 1.354s CPU time, read 32.0K from disk, written 4.0K to disk.
Nov 22 09:51:48 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Nov 22 09:51:48 compute-0 systemd[1]: Stopping sshd-keygen.target...
Nov 22 09:51:48 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 09:51:48 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 09:51:48 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 09:51:48 compute-0 systemd[1]: Reached target sshd-keygen.target.
Nov 22 09:51:48 compute-0 systemd[1]: Starting OpenSSH server daemon...
Nov 22 09:51:48 compute-0 sshd[128570]: Server listening on 0.0.0.0 port 22.
Nov 22 09:51:48 compute-0 sshd[128570]: Server listening on :: port 22.
Nov 22 09:51:48 compute-0 systemd[1]: Started OpenSSH server daemon.
Nov 22 09:51:48 compute-0 podman[128558]: 2025-11-22 09:51:48.789777473 +0000 UTC m=+0.079567223 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 09:51:50 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 09:51:50 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 09:51:50 compute-0 systemd[1]: Reloading.
Nov 22 09:51:50 compute-0 systemd-rc-local-generator[128835]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:51:50 compute-0 systemd-sysv-generator[128839]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:51:50 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 09:52:04 compute-0 sudo[110634]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:05 compute-0 sudo[136854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgeojxrtryqaksxjapchzuozzfzgxpce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805124.8798053-336-68041522657557/AnsiballZ_systemd.py'
Nov 22 09:52:05 compute-0 sudo[136854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:05 compute-0 python3.9[136880]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 09:52:05 compute-0 systemd[1]: Reloading.
Nov 22 09:52:05 compute-0 systemd-rc-local-generator[137350]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:52:05 compute-0 systemd-sysv-generator[137356]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:52:06 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 09:52:06 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 09:52:06 compute-0 systemd[1]: man-db-cache-update.service: Consumed 12.123s CPU time.
Nov 22 09:52:06 compute-0 systemd[1]: run-re19fa109b49248d7ad135559b8e486b2.service: Deactivated successfully.
Nov 22 09:52:06 compute-0 sudo[136854]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:06 compute-0 sudo[137555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcgtqghisbddtbhvjsybahsakwpimgzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805126.2494547-336-260204380440571/AnsiballZ_systemd.py'
Nov 22 09:52:06 compute-0 sudo[137555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:06 compute-0 python3.9[137557]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 09:52:06 compute-0 systemd[1]: Reloading.
Nov 22 09:52:06 compute-0 systemd-rc-local-generator[137587]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:52:06 compute-0 systemd-sysv-generator[137590]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:52:07 compute-0 sudo[137555]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:07 compute-0 sudo[137745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcnxasnqwglyghbaicljncodbnjyopkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805127.3079321-336-50022349928002/AnsiballZ_systemd.py'
Nov 22 09:52:07 compute-0 sudo[137745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:07 compute-0 python3.9[137747]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 09:52:07 compute-0 systemd[1]: Reloading.
Nov 22 09:52:08 compute-0 systemd-sysv-generator[137782]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:52:08 compute-0 systemd-rc-local-generator[137777]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:52:08 compute-0 sudo[137745]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:08 compute-0 sudo[137935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kszefgzdionvqnqgqbrlwnfzexhzxvtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805128.3597705-336-47140913512362/AnsiballZ_systemd.py'
Nov 22 09:52:08 compute-0 sudo[137935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:08 compute-0 python3.9[137937]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 09:52:08 compute-0 systemd[1]: Reloading.
Nov 22 09:52:09 compute-0 systemd-rc-local-generator[137968]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:52:09 compute-0 systemd-sysv-generator[137972]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:52:09 compute-0 sudo[137935]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:09 compute-0 sudo[138126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whpqufrunzrjciynjnmdpltphyfqnlvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805129.4010048-365-98547873618653/AnsiballZ_systemd.py'
Nov 22 09:52:09 compute-0 sudo[138126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:10 compute-0 python3.9[138128]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:10 compute-0 systemd[1]: Reloading.
Nov 22 09:52:10 compute-0 systemd-sysv-generator[138162]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:52:10 compute-0 systemd-rc-local-generator[138158]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:52:10 compute-0 sudo[138126]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:10 compute-0 sudo[138316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmlkfanhftfhdekugxfqkdbcewbvbydx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805130.5032985-365-61455916355880/AnsiballZ_systemd.py'
Nov 22 09:52:10 compute-0 sudo[138316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:11 compute-0 python3.9[138318]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:11 compute-0 systemd[1]: Reloading.
Nov 22 09:52:11 compute-0 systemd-sysv-generator[138351]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:52:11 compute-0 systemd-rc-local-generator[138346]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:52:11 compute-0 sudo[138316]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:11 compute-0 sudo[138506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlqxjyjupfpjzmlawxqumivqzqkxcaew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805131.5686076-365-52400047381916/AnsiballZ_systemd.py'
Nov 22 09:52:11 compute-0 sudo[138506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:12 compute-0 python3.9[138508]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:12 compute-0 systemd[1]: Reloading.
Nov 22 09:52:12 compute-0 systemd-sysv-generator[138543]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:52:12 compute-0 systemd-rc-local-generator[138537]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:52:12 compute-0 sudo[138506]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:12 compute-0 sudo[138697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzatwtnbxnpceauermaazomdqcnnannz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805132.6416936-365-573842821227/AnsiballZ_systemd.py'
Nov 22 09:52:12 compute-0 sudo[138697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:13 compute-0 python3.9[138699]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:13 compute-0 sudo[138697]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:13 compute-0 podman[138701]: 2025-11-22 09:52:13.468228191 +0000 UTC m=+0.103549100 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:52:13 compute-0 sudo[138878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilxiqnzwdusmnurwywwejvkrgxvmvvus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805133.569964-365-203991175612554/AnsiballZ_systemd.py'
Nov 22 09:52:13 compute-0 sudo[138878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:14 compute-0 python3.9[138880]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:14 compute-0 systemd[1]: Reloading.
Nov 22 09:52:14 compute-0 systemd-rc-local-generator[138907]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:52:14 compute-0 systemd-sysv-generator[138912]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:52:14 compute-0 sudo[138878]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:15 compute-0 sudo[139068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xirjgynqkvjmlohmwshxitbtomuhmxhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805134.8691952-401-83680861656487/AnsiballZ_systemd.py'
Nov 22 09:52:15 compute-0 sudo[139068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:15 compute-0 python3.9[139070]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 09:52:15 compute-0 systemd[1]: Reloading.
Nov 22 09:52:15 compute-0 systemd-rc-local-generator[139102]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:52:15 compute-0 systemd-sysv-generator[139105]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:52:16 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 22 09:52:16 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 22 09:52:16 compute-0 sudo[139068]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:16 compute-0 sudo[139262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpkbsprkrgiszsmgsizgyhfrxoeleziv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805136.3029401-409-136577058887741/AnsiballZ_systemd.py'
Nov 22 09:52:16 compute-0 sudo[139262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:16 compute-0 python3.9[139264]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:17 compute-0 sudo[139262]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:17 compute-0 sudo[139417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phgfrywdejmsahggpxuxdmckixbtbxmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805137.2391257-409-226876305140323/AnsiballZ_systemd.py'
Nov 22 09:52:17 compute-0 sudo[139417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:17 compute-0 python3.9[139419]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:52:17.915 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:52:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:52:17.915 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:52:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:52:17.915 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:52:17 compute-0 sudo[139417]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:18 compute-0 sudo[139572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsshyrxsxhfbzthzgfcjhqvkwipubloh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805138.1235-409-47571231821555/AnsiballZ_systemd.py'
Nov 22 09:52:18 compute-0 sudo[139572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:18 compute-0 python3.9[139574]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:18 compute-0 sudo[139572]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:19 compute-0 podman[139576]: 2025-11-22 09:52:19.005965197 +0000 UTC m=+0.089542841 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:52:19 compute-0 sudo[139746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpxpnjkbqcufhmbyvrlboeknculayrbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805139.151652-409-5960359171887/AnsiballZ_systemd.py'
Nov 22 09:52:19 compute-0 sudo[139746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:19 compute-0 python3.9[139748]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:20 compute-0 sudo[139746]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:20 compute-0 sudo[139901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbbcqfpffswqovsfipbklnzfiutlhgzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805140.172714-409-71555007438724/AnsiballZ_systemd.py'
Nov 22 09:52:20 compute-0 sudo[139901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:20 compute-0 python3.9[139903]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:20 compute-0 sudo[139901]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:21 compute-0 sudo[140056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkoxxsasevdqfopyvedqvrqdbwdgbuix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805141.1283364-409-216946515012314/AnsiballZ_systemd.py'
Nov 22 09:52:21 compute-0 sudo[140056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:21 compute-0 python3.9[140058]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:22 compute-0 sudo[140056]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:22 compute-0 sudo[140211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woaxufqwduwzsvjmzffnqvrvneuyqfvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805142.2018707-409-182176546584331/AnsiballZ_systemd.py'
Nov 22 09:52:22 compute-0 sudo[140211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:22 compute-0 python3.9[140213]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:23 compute-0 sudo[140211]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:23 compute-0 sudo[140366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urfypocrnrlabzjcgeosfjgewfipzlze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805143.2698119-409-97356808849270/AnsiballZ_systemd.py'
Nov 22 09:52:23 compute-0 sudo[140366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:24 compute-0 python3.9[140368]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:24 compute-0 sudo[140366]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:24 compute-0 sudo[140521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdowfwbtzlolukazgwokpsvanzyvviag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805144.3884068-409-262447178408288/AnsiballZ_systemd.py'
Nov 22 09:52:24 compute-0 sudo[140521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:25 compute-0 python3.9[140523]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:25 compute-0 sudo[140521]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:25 compute-0 sudo[140676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwbltggeuzrrevycajvmyxzhgqxbjcty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805145.307859-409-199516591273452/AnsiballZ_systemd.py'
Nov 22 09:52:25 compute-0 sudo[140676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:26 compute-0 python3.9[140678]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:26 compute-0 sudo[140676]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:26 compute-0 sudo[140831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyeijiwwwuvuhwepshkakmywrpdvucde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805146.3926673-409-240212141284239/AnsiballZ_systemd.py'
Nov 22 09:52:26 compute-0 sudo[140831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:27 compute-0 python3.9[140833]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:28 compute-0 sudo[140831]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:28 compute-0 sudo[140986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpsiqcdifsiohuirzqgftixahqyghujx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805148.4062855-409-174391665751763/AnsiballZ_systemd.py'
Nov 22 09:52:28 compute-0 sudo[140986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:29 compute-0 python3.9[140988]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:29 compute-0 sudo[140986]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:29 compute-0 sudo[141141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyxznvpdqefmmqyylwjzopelnroussyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805149.5582302-409-58485318231669/AnsiballZ_systemd.py'
Nov 22 09:52:29 compute-0 sudo[141141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:30 compute-0 python3.9[141143]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:30 compute-0 sudo[141141]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:30 compute-0 sudo[141296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujqknhiqxxhxbfmncyohnqjaiqrnogsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805150.5606747-409-213124079519585/AnsiballZ_systemd.py'
Nov 22 09:52:30 compute-0 sudo[141296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:31 compute-0 python3.9[141298]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 09:52:31 compute-0 sudo[141296]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:32 compute-0 sudo[141451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-risnidhuzaeuacpznerstlxkxlcfxvzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805151.7097824-511-197442057652523/AnsiballZ_file.py'
Nov 22 09:52:32 compute-0 sudo[141451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:32 compute-0 python3.9[141453]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:52:32 compute-0 sudo[141451]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:32 compute-0 sudo[141603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzupbajlklfzgkapemjunpyyiuifamnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805152.5135071-511-200915999458840/AnsiballZ_file.py'
Nov 22 09:52:32 compute-0 sudo[141603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:32 compute-0 python3.9[141605]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:52:33 compute-0 sudo[141603]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:33 compute-0 sudo[141755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvqgsvswwihdyhruxrbqpoexgrhsssav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805153.1983004-511-248389381092932/AnsiballZ_file.py'
Nov 22 09:52:33 compute-0 sudo[141755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:33 compute-0 python3.9[141757]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:52:33 compute-0 sudo[141755]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:34 compute-0 sudo[141907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrvwjtzmwoapltgivxcwnjmrolvwqykj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805153.8811715-511-245492196456989/AnsiballZ_file.py'
Nov 22 09:52:34 compute-0 sudo[141907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:34 compute-0 python3.9[141909]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:52:34 compute-0 sudo[141907]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:34 compute-0 sudo[142059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdqrhpqplojcwfssrnzrhqzqzyymfsuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805154.6223953-511-243796052104622/AnsiballZ_file.py'
Nov 22 09:52:34 compute-0 sudo[142059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:35 compute-0 python3.9[142061]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:52:35 compute-0 sudo[142059]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:35 compute-0 sudo[142211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzpuohffvcbuurubuvquntkdxxrpaceg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805155.3229034-511-275988172380841/AnsiballZ_file.py'
Nov 22 09:52:35 compute-0 sudo[142211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:35 compute-0 python3.9[142213]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:52:35 compute-0 sudo[142211]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:36 compute-0 sudo[142363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzschbunnlofbgghqqnwqsgbhoyzzius ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805156.1580095-554-280171971241956/AnsiballZ_stat.py'
Nov 22 09:52:36 compute-0 sudo[142363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:36 compute-0 python3.9[142365]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:52:36 compute-0 sudo[142363]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:37 compute-0 sudo[142488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shvfcirdzxcjegsubxumhwtywrytsfhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805156.1580095-554-280171971241956/AnsiballZ_copy.py'
Nov 22 09:52:37 compute-0 sudo[142488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:37 compute-0 python3.9[142490]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763805156.1580095-554-280171971241956/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:37 compute-0 sudo[142488]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:38 compute-0 sudo[142640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdtigzbadayvhlghfkrafccajuaswldv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805157.9487133-554-134823567535061/AnsiballZ_stat.py'
Nov 22 09:52:38 compute-0 sudo[142640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:38 compute-0 python3.9[142642]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:52:38 compute-0 sudo[142640]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:38 compute-0 sudo[142765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvavalaefxhfnzkhgblfwjhtpkolaazo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805157.9487133-554-134823567535061/AnsiballZ_copy.py'
Nov 22 09:52:38 compute-0 sudo[142765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:39 compute-0 python3.9[142767]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763805157.9487133-554-134823567535061/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:39 compute-0 sudo[142765]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:39 compute-0 sudo[142917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmbyjfcwkjpopxzxodisosybwwldgzcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805159.3444607-554-169250643652880/AnsiballZ_stat.py'
Nov 22 09:52:39 compute-0 sudo[142917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:39 compute-0 python3.9[142919]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:52:39 compute-0 sudo[142917]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:40 compute-0 sudo[143042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuxpmtqopvmkgfuqlvaodtjlwhublktd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805159.3444607-554-169250643652880/AnsiballZ_copy.py'
Nov 22 09:52:40 compute-0 sudo[143042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:40 compute-0 python3.9[143044]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763805159.3444607-554-169250643652880/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:40 compute-0 sudo[143042]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:40 compute-0 sudo[143194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vryckupgpjanganngmlzkifixmdatmaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805160.6397896-554-62772427440652/AnsiballZ_stat.py'
Nov 22 09:52:40 compute-0 sudo[143194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:41 compute-0 python3.9[143196]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:52:41 compute-0 sudo[143194]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:41 compute-0 sudo[143319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkxeklrjnrtwmfprggapopyonrslwlcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805160.6397896-554-62772427440652/AnsiballZ_copy.py'
Nov 22 09:52:41 compute-0 sudo[143319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:41 compute-0 python3.9[143321]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763805160.6397896-554-62772427440652/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:41 compute-0 sudo[143319]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:42 compute-0 sudo[143471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cddkhyoxmwqnckcwndijiksgzojnpzem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805162.0835319-554-197141986123243/AnsiballZ_stat.py'
Nov 22 09:52:42 compute-0 sudo[143471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:42 compute-0 python3.9[143473]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:52:42 compute-0 sudo[143471]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:43 compute-0 sudo[143596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsrudwpjosaepgaaglvymsrjodjcqxwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805162.0835319-554-197141986123243/AnsiballZ_copy.py'
Nov 22 09:52:43 compute-0 sudo[143596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:43 compute-0 python3.9[143598]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763805162.0835319-554-197141986123243/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:43 compute-0 sudo[143596]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:43 compute-0 podman[143630]: 2025-11-22 09:52:43.685939344 +0000 UTC m=+0.133637397 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:52:43 compute-0 sudo[143775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqibnpaenelzpbkaqqwbdznwzqllidez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805163.557475-554-225162415758850/AnsiballZ_stat.py'
Nov 22 09:52:43 compute-0 sudo[143775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:44 compute-0 python3.9[143777]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:52:44 compute-0 sudo[143775]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:44 compute-0 sudo[143900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbamkirbdawjprcwegyzdtzdoajdpdxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805163.557475-554-225162415758850/AnsiballZ_copy.py'
Nov 22 09:52:44 compute-0 sudo[143900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:44 compute-0 python3.9[143902]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763805163.557475-554-225162415758850/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:44 compute-0 sudo[143900]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:45 compute-0 sudo[144052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytkxxqrtymtbofiwgxijkyjkapsmipaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805164.9878862-554-108072041413515/AnsiballZ_stat.py'
Nov 22 09:52:45 compute-0 sudo[144052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:45 compute-0 python3.9[144054]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:52:45 compute-0 sudo[144052]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:46 compute-0 sudo[144175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geblrrwcdgsailcmscwmxbhhqauqaygz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805164.9878862-554-108072041413515/AnsiballZ_copy.py'
Nov 22 09:52:46 compute-0 sudo[144175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:46 compute-0 python3.9[144177]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763805164.9878862-554-108072041413515/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:46 compute-0 sudo[144175]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:46 compute-0 sudo[144327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwxaekefcwnyjiwssymxliiixwzezpfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805166.4404294-554-42428522411144/AnsiballZ_stat.py'
Nov 22 09:52:46 compute-0 sudo[144327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:46 compute-0 python3.9[144329]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:52:46 compute-0 sudo[144327]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:47 compute-0 sudo[144452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmhjabxpnhejsjvxihswvxfcqxfjpnnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805166.4404294-554-42428522411144/AnsiballZ_copy.py'
Nov 22 09:52:47 compute-0 sudo[144452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:47 compute-0 python3.9[144454]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763805166.4404294-554-42428522411144/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:47 compute-0 sudo[144452]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:48 compute-0 sudo[144604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iellmsmvxoevqlzuxoovyridhpvnbhjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805167.7171159-667-29430165711915/AnsiballZ_command.py'
Nov 22 09:52:48 compute-0 sudo[144604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:48 compute-0 python3.9[144606]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 22 09:52:48 compute-0 sudo[144604]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:48 compute-0 sudo[144757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lshrwysoywmqkzfgkggfqhadaoelbsev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805168.4531014-676-119674957287698/AnsiballZ_file.py'
Nov 22 09:52:48 compute-0 sudo[144757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:48 compute-0 python3.9[144759]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:48 compute-0 sudo[144757]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:49 compute-0 sudo[144923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idgznovdqgxucnvhjmklcrchkfjzbdjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805169.078955-676-178000035663081/AnsiballZ_file.py'
Nov 22 09:52:49 compute-0 sudo[144923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:49 compute-0 podman[144883]: 2025-11-22 09:52:49.387673938 +0000 UTC m=+0.053546958 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 09:52:49 compute-0 python3.9[144931]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:49 compute-0 sudo[144923]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:50 compute-0 sudo[145081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slafowlsgpkomhsndboldwcaycjzkoam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805169.7116594-676-61491508078363/AnsiballZ_file.py'
Nov 22 09:52:50 compute-0 sudo[145081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:50 compute-0 python3.9[145083]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:50 compute-0 sudo[145081]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:50 compute-0 sudo[145233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehmgbvefixhpitysclhetrypjzuyzslw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805170.4038963-676-122275952499448/AnsiballZ_file.py'
Nov 22 09:52:50 compute-0 sudo[145233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:50 compute-0 python3.9[145235]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:51 compute-0 sudo[145233]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:51 compute-0 sudo[145385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhkhmbzdqcvwssmjdezxhxlzzrepqcnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805171.1670732-676-263700768811173/AnsiballZ_file.py'
Nov 22 09:52:51 compute-0 sudo[145385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:51 compute-0 python3.9[145387]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:51 compute-0 sudo[145385]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:52 compute-0 sudo[145537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oslshllqzbhgmmgtznqwqmpyxkxasuij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805171.893371-676-63014212805209/AnsiballZ_file.py'
Nov 22 09:52:52 compute-0 sudo[145537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:52 compute-0 python3.9[145539]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:52 compute-0 sudo[145537]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:52 compute-0 sudo[145689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvyrkkulpwwehctvgmnvqsbcyxhhbarj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805172.6004813-676-256440427557541/AnsiballZ_file.py'
Nov 22 09:52:52 compute-0 sudo[145689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:53 compute-0 python3.9[145691]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:53 compute-0 sudo[145689]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:53 compute-0 sudo[145841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpedmifstzfpyprqfckzagypaqctpdxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805173.2668304-676-5391289422493/AnsiballZ_file.py'
Nov 22 09:52:53 compute-0 sudo[145841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:53 compute-0 python3.9[145843]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:53 compute-0 sudo[145841]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:54 compute-0 sudo[145993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiufenlvwlsixadugagletaimtyvqgkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805174.0792506-676-206618839845218/AnsiballZ_file.py'
Nov 22 09:52:54 compute-0 sudo[145993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:54 compute-0 python3.9[145995]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:54 compute-0 sudo[145993]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:55 compute-0 sudo[146145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpkzwtvjodjvokdhtiriytremfdqmhxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805174.7356844-676-111585038568833/AnsiballZ_file.py'
Nov 22 09:52:55 compute-0 sudo[146145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:55 compute-0 python3.9[146147]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:55 compute-0 sudo[146145]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:55 compute-0 sudo[146297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stmsmkspwhblofghtwmehhauxfidayke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805175.5087583-676-106910329342816/AnsiballZ_file.py'
Nov 22 09:52:55 compute-0 sudo[146297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:56 compute-0 python3.9[146299]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:56 compute-0 sudo[146297]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:56 compute-0 sudo[146449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrabtdfvkfdrcoqxcmcldwcpsloqexxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805176.1742492-676-45075820727703/AnsiballZ_file.py'
Nov 22 09:52:56 compute-0 sudo[146449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:56 compute-0 python3.9[146451]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:56 compute-0 sudo[146449]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:57 compute-0 sudo[146601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpcovaebxeyzgddnivolqdwdxkulpzof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805177.008773-676-142257662926451/AnsiballZ_file.py'
Nov 22 09:52:57 compute-0 sudo[146601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:57 compute-0 python3.9[146603]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:57 compute-0 sudo[146601]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:58 compute-0 sudo[146753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbvzhaacjuqwxutsowabsqwvyzorksgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805177.7277977-676-229201890785653/AnsiballZ_file.py'
Nov 22 09:52:58 compute-0 sudo[146753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:58 compute-0 python3.9[146755]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:58 compute-0 sudo[146753]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:58 compute-0 sudo[146905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfemimpdrfjslnujyvpufdclxwjkkrpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805178.5413346-775-61561050155227/AnsiballZ_stat.py'
Nov 22 09:52:58 compute-0 sudo[146905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:59 compute-0 python3.9[146907]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:52:59 compute-0 sudo[146905]: pam_unix(sudo:session): session closed for user root
Nov 22 09:52:59 compute-0 sudo[147028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqzofrvvcacwcfldnwmdvsziiolcbiqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805178.5413346-775-61561050155227/AnsiballZ_copy.py'
Nov 22 09:52:59 compute-0 sudo[147028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:52:59 compute-0 python3.9[147030]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805178.5413346-775-61561050155227/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:52:59 compute-0 sudo[147028]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:00 compute-0 sudo[147180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yljqmqxzjlekwcovgikxlyhuaecbytvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805179.9478314-775-45524621125953/AnsiballZ_stat.py'
Nov 22 09:53:00 compute-0 sudo[147180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:00 compute-0 python3.9[147182]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:00 compute-0 sudo[147180]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:00 compute-0 sudo[147303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beqskatmwwfuohuyhwwbeahiczisfmno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805179.9478314-775-45524621125953/AnsiballZ_copy.py'
Nov 22 09:53:00 compute-0 sudo[147303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:01 compute-0 python3.9[147305]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805179.9478314-775-45524621125953/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:01 compute-0 sudo[147303]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:01 compute-0 sudo[147455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkfybmilosftpaivmetwcplomkgogczj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805181.3886638-775-127410428009230/AnsiballZ_stat.py'
Nov 22 09:53:01 compute-0 sudo[147455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:01 compute-0 python3.9[147457]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:01 compute-0 sudo[147455]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:02 compute-0 sudo[147578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oldxizkehknxmzrvrdeejeyfcjummdow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805181.3886638-775-127410428009230/AnsiballZ_copy.py'
Nov 22 09:53:02 compute-0 sudo[147578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:02 compute-0 python3.9[147580]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805181.3886638-775-127410428009230/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:02 compute-0 sudo[147578]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:03 compute-0 sudo[147730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjdvkpoekvqojrnehztrfsyyoktrfgmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805182.7723975-775-189337044936932/AnsiballZ_stat.py'
Nov 22 09:53:03 compute-0 sudo[147730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:03 compute-0 python3.9[147732]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:03 compute-0 sudo[147730]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:03 compute-0 sudo[147853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsykkdqmjvrsmwnqlbzwritnlehongqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805182.7723975-775-189337044936932/AnsiballZ_copy.py'
Nov 22 09:53:03 compute-0 sudo[147853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:04 compute-0 python3.9[147855]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805182.7723975-775-189337044936932/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:04 compute-0 sudo[147853]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:04 compute-0 sudo[148005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yotbifclgkyhsqqerxezhgxuhqihnvhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805184.345092-775-279587715060269/AnsiballZ_stat.py'
Nov 22 09:53:04 compute-0 sudo[148005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:04 compute-0 python3.9[148007]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:04 compute-0 sudo[148005]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:05 compute-0 sudo[148128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyepcjdppnuyerqfffuywcjvnpladrly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805184.345092-775-279587715060269/AnsiballZ_copy.py'
Nov 22 09:53:05 compute-0 sudo[148128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:05 compute-0 python3.9[148130]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805184.345092-775-279587715060269/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:05 compute-0 sudo[148128]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:06 compute-0 sudo[148280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ashjjbvyalbhhkwwujvyrkafaxykenpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805185.8537595-775-271910767767704/AnsiballZ_stat.py'
Nov 22 09:53:06 compute-0 sudo[148280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:06 compute-0 python3.9[148282]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:06 compute-0 sudo[148280]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:06 compute-0 sudo[148403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lscepvqhbpqeweafcvenkpxdnomvlvax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805185.8537595-775-271910767767704/AnsiballZ_copy.py'
Nov 22 09:53:06 compute-0 sudo[148403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:07 compute-0 python3.9[148405]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805185.8537595-775-271910767767704/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:07 compute-0 sudo[148403]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:07 compute-0 sudo[148555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apdvqnfoqdpfyjzsrrcncerikyseteae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805187.3305268-775-109913754651995/AnsiballZ_stat.py'
Nov 22 09:53:07 compute-0 sudo[148555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:07 compute-0 python3.9[148557]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:07 compute-0 sudo[148555]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:08 compute-0 sudo[148678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eslspxccruguwafyvfrbqzyntcczmoxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805187.3305268-775-109913754651995/AnsiballZ_copy.py'
Nov 22 09:53:08 compute-0 sudo[148678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:08 compute-0 python3.9[148680]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805187.3305268-775-109913754651995/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:08 compute-0 sudo[148678]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:09 compute-0 sudo[148830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbvmwbbylvhhaqeaylusvcgojptciktq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805188.8262062-775-40387412210093/AnsiballZ_stat.py'
Nov 22 09:53:09 compute-0 sudo[148830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:09 compute-0 python3.9[148832]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:09 compute-0 sudo[148830]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:09 compute-0 sudo[148953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhyavpqwcnjzsitnexxmorcxiualwxdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805188.8262062-775-40387412210093/AnsiballZ_copy.py'
Nov 22 09:53:09 compute-0 sudo[148953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:10 compute-0 python3.9[148955]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805188.8262062-775-40387412210093/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:10 compute-0 sudo[148953]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:10 compute-0 sudo[149105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lphfqgatqlqcnptnlsmfpusirxyeyvui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805190.3979342-775-153269309346113/AnsiballZ_stat.py'
Nov 22 09:53:10 compute-0 sudo[149105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:11 compute-0 python3.9[149107]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:11 compute-0 sudo[149105]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:11 compute-0 sudo[149228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhwxsoiycqkllrdmgtmpknaisuuupvgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805190.3979342-775-153269309346113/AnsiballZ_copy.py'
Nov 22 09:53:11 compute-0 sudo[149228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:11 compute-0 python3.9[149230]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805190.3979342-775-153269309346113/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:11 compute-0 sudo[149228]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:12 compute-0 sudo[149380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhhzfwvfglwjouwdlyrnnrbdojdygkcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805191.8410106-775-133922798283016/AnsiballZ_stat.py'
Nov 22 09:53:12 compute-0 sudo[149380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:12 compute-0 python3.9[149382]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:12 compute-0 sudo[149380]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:12 compute-0 sudo[149503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbthhhrhwyktbgavxrigshykctimriip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805191.8410106-775-133922798283016/AnsiballZ_copy.py'
Nov 22 09:53:12 compute-0 sudo[149503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:13 compute-0 python3.9[149505]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805191.8410106-775-133922798283016/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:13 compute-0 sudo[149503]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:13 compute-0 sudo[149655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkhvksjaoyyqsvrvarnmuqkkynmcoomh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805193.282503-775-156565062473807/AnsiballZ_stat.py'
Nov 22 09:53:13 compute-0 sudo[149655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:13 compute-0 python3.9[149657]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:13 compute-0 sudo[149655]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:14 compute-0 sudo[149797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfxqutnyrmhtrgifwtwgwwmgpxgsvrbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805193.282503-775-156565062473807/AnsiballZ_copy.py'
Nov 22 09:53:14 compute-0 sudo[149797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:14 compute-0 podman[149752]: 2025-11-22 09:53:14.236588259 +0000 UTC m=+0.074032523 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 09:53:14 compute-0 python3.9[149803]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805193.282503-775-156565062473807/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:14 compute-0 sudo[149797]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:14 compute-0 sudo[149956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kopksjkqhoggvxysotdhybputjfjiaag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805194.6403441-775-217657832887321/AnsiballZ_stat.py'
Nov 22 09:53:14 compute-0 sudo[149956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:15 compute-0 python3.9[149958]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:15 compute-0 sudo[149956]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:15 compute-0 sudo[150079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mueybwbeyojyxoxmuofpkojdjnlrxbzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805194.6403441-775-217657832887321/AnsiballZ_copy.py'
Nov 22 09:53:15 compute-0 sudo[150079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:15 compute-0 python3.9[150081]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805194.6403441-775-217657832887321/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:15 compute-0 sudo[150079]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:16 compute-0 sudo[150231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtqzcnpcfgzdpobgondzcgyskwanqrut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805195.8827674-775-9824038460710/AnsiballZ_stat.py'
Nov 22 09:53:16 compute-0 sudo[150231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:16 compute-0 python3.9[150233]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:16 compute-0 sudo[150231]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:16 compute-0 sudo[150354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovmsmdfomkyacrbyevazboxbcbdpzkth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805195.8827674-775-9824038460710/AnsiballZ_copy.py'
Nov 22 09:53:16 compute-0 sudo[150354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:17 compute-0 python3.9[150356]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805195.8827674-775-9824038460710/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:17 compute-0 sudo[150354]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:17 compute-0 sudo[150506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eptqnnuakkihhuknsjgsxxhwemiodown ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805197.171769-775-229066611987254/AnsiballZ_stat.py'
Nov 22 09:53:17 compute-0 sudo[150506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:17 compute-0 python3.9[150508]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:17 compute-0 sudo[150506]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:53:17.915 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:53:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:53:17.916 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:53:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:53:17.916 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:53:18 compute-0 sudo[150629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okdxoxvfeyawbkgqxoocpupbnskcfpqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805197.171769-775-229066611987254/AnsiballZ_copy.py'
Nov 22 09:53:18 compute-0 sudo[150629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:18 compute-0 python3.9[150631]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805197.171769-775-229066611987254/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:18 compute-0 sudo[150629]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:18 compute-0 python3.9[150781]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:53:19 compute-0 podman[150861]: 2025-11-22 09:53:19.598397336 +0000 UTC m=+0.049766527 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 09:53:19 compute-0 sudo[150953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmggpoatqcyieidiikkabxujysvhwhto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805199.2869346-981-90337390653858/AnsiballZ_seboolean.py'
Nov 22 09:53:19 compute-0 sudo[150953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:19 compute-0 python3.9[150955]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 22 09:53:21 compute-0 sudo[150953]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:21 compute-0 sudo[151109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxqazvbefaxehtbbhvxjaegmrgpnruxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805201.447844-989-6655372360391/AnsiballZ_copy.py'
Nov 22 09:53:21 compute-0 dbus-broker-launch[813]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 22 09:53:21 compute-0 sudo[151109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:21 compute-0 python3.9[151111]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:21 compute-0 sudo[151109]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:22 compute-0 sudo[151261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upffwubtqepowsxtwxkjvfdxcfnklcou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805202.1050267-989-237999307230855/AnsiballZ_copy.py'
Nov 22 09:53:22 compute-0 sudo[151261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:22 compute-0 python3.9[151263]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:22 compute-0 sudo[151261]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:23 compute-0 sudo[151413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xenfthsobelfbubigywgvkjwhiawuang ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805202.7791636-989-65961021583413/AnsiballZ_copy.py'
Nov 22 09:53:23 compute-0 sudo[151413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:23 compute-0 python3.9[151415]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:23 compute-0 sudo[151413]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:23 compute-0 sudo[151565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kznhttssxprrhyilvgdohjrpyprimygr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805203.4488797-989-173352582692183/AnsiballZ_copy.py'
Nov 22 09:53:23 compute-0 sudo[151565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:24 compute-0 python3.9[151567]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:24 compute-0 sudo[151565]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:24 compute-0 sudo[151717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkwqmoileqgplmqdiyhevnnbqrbjaowr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805204.248627-989-105034972326409/AnsiballZ_copy.py'
Nov 22 09:53:24 compute-0 sudo[151717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:24 compute-0 python3.9[151719]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:24 compute-0 sudo[151717]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:25 compute-0 sudo[151869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbnraclxfemjpedjjrswqwbvbvqdzjak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805205.044522-1025-95784743484340/AnsiballZ_copy.py'
Nov 22 09:53:25 compute-0 sudo[151869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:25 compute-0 python3.9[151871]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:25 compute-0 sudo[151869]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:26 compute-0 sudo[152021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgwhmtglixznqhxdkrgkozsgvridlwnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805205.820076-1025-272307035245387/AnsiballZ_copy.py'
Nov 22 09:53:26 compute-0 sudo[152021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:26 compute-0 python3.9[152023]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:26 compute-0 sudo[152021]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:26 compute-0 sudo[152173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdgkcxaahnvvmwgldmuxpyetklokrrxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805206.4662738-1025-232848309761609/AnsiballZ_copy.py'
Nov 22 09:53:26 compute-0 sudo[152173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:26 compute-0 python3.9[152175]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:27 compute-0 sudo[152173]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:27 compute-0 sudo[152325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkvcpfucdwdrtvixyefsyxgrgrqoprfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805207.2183568-1025-206070676359333/AnsiballZ_copy.py'
Nov 22 09:53:27 compute-0 sudo[152325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:27 compute-0 python3.9[152327]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:27 compute-0 sudo[152325]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:28 compute-0 sudo[152477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-satbelhexrdnxxfczplkhaulgyauxqha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805208.0537775-1025-34795751966223/AnsiballZ_copy.py'
Nov 22 09:53:28 compute-0 sudo[152477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:28 compute-0 python3.9[152479]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:28 compute-0 sudo[152477]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:29 compute-0 sudo[152629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyjniftnfoxmqbibwuagxqyjwlryxjvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805208.9475086-1061-134663836598890/AnsiballZ_systemd.py'
Nov 22 09:53:29 compute-0 sudo[152629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:29 compute-0 python3.9[152631]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:53:29 compute-0 systemd[1]: Reloading.
Nov 22 09:53:29 compute-0 systemd-rc-local-generator[152656]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:53:29 compute-0 systemd-sysv-generator[152661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:53:30 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Nov 22 09:53:30 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Nov 22 09:53:30 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 22 09:53:30 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 22 09:53:30 compute-0 systemd[1]: Starting libvirt logging daemon...
Nov 22 09:53:30 compute-0 systemd[1]: Started libvirt logging daemon.
Nov 22 09:53:30 compute-0 sudo[152629]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:30 compute-0 sudo[152823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojbsvhpwlrmkrlfdrkceghqkgfsaqfyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805210.3506856-1061-82222207047962/AnsiballZ_systemd.py'
Nov 22 09:53:30 compute-0 sudo[152823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:31 compute-0 python3.9[152825]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:53:31 compute-0 systemd[1]: Reloading.
Nov 22 09:53:31 compute-0 systemd-sysv-generator[152856]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:53:31 compute-0 systemd-rc-local-generator[152853]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:53:31 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 22 09:53:31 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 22 09:53:31 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 22 09:53:31 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 22 09:53:31 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 22 09:53:31 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 22 09:53:31 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 22 09:53:31 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 22 09:53:31 compute-0 sudo[152823]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:32 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 22 09:53:32 compute-0 sudo[153040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nklgezqdqafbfacykrcuefemofmsrjmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805211.7584248-1061-47834963298408/AnsiballZ_systemd.py'
Nov 22 09:53:32 compute-0 sudo[153040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:32 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 22 09:53:32 compute-0 python3.9[153042]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:53:32 compute-0 systemd[1]: Reloading.
Nov 22 09:53:32 compute-0 systemd-sysv-generator[153075]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:53:32 compute-0 systemd-rc-local-generator[153070]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:53:32 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 22 09:53:32 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 22 09:53:32 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 22 09:53:32 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 22 09:53:32 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 22 09:53:32 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 22 09:53:32 compute-0 sudo[153040]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:33 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 22 09:53:33 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 22 09:53:33 compute-0 sudo[153260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbbssabfnisptlqlqxfunynyluvusgge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805213.0317929-1061-247885411686406/AnsiballZ_systemd.py'
Nov 22 09:53:33 compute-0 sudo[153260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:33 compute-0 python3.9[153262]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:53:33 compute-0 systemd[1]: Reloading.
Nov 22 09:53:33 compute-0 systemd-rc-local-generator[153285]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:53:33 compute-0 systemd-sysv-generator[153288]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:53:34 compute-0 setroubleshoot[153013]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l ef8e65cd-8021-4b48-b585-5778542cb467
Nov 22 09:53:34 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Nov 22 09:53:34 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 22 09:53:34 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 22 09:53:34 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 22 09:53:34 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 22 09:53:34 compute-0 setroubleshoot[153013]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 22 09:53:34 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 22 09:53:34 compute-0 setroubleshoot[153013]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l ef8e65cd-8021-4b48-b585-5778542cb467
Nov 22 09:53:34 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 22 09:53:34 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 22 09:53:34 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 22 09:53:34 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 22 09:53:34 compute-0 setroubleshoot[153013]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 22 09:53:34 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 22 09:53:34 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 22 09:53:34 compute-0 sudo[153260]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:34 compute-0 sudo[153476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqxoqqbhcicfdhvkisqelhnegatlboqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805214.2845945-1061-151532243311789/AnsiballZ_systemd.py'
Nov 22 09:53:34 compute-0 sudo[153476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:34 compute-0 python3.9[153478]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:53:34 compute-0 systemd[1]: Reloading.
Nov 22 09:53:35 compute-0 systemd-rc-local-generator[153499]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:53:35 compute-0 systemd-sysv-generator[153505]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:53:35 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Nov 22 09:53:35 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Nov 22 09:53:35 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 22 09:53:35 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 22 09:53:35 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 22 09:53:35 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 22 09:53:35 compute-0 systemd[1]: Starting libvirt secret daemon...
Nov 22 09:53:35 compute-0 systemd[1]: Started libvirt secret daemon.
Nov 22 09:53:35 compute-0 sudo[153476]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:36 compute-0 sudo[153689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukbnrpydnvaxrpoolpbswpqgurpzwumn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805215.7970665-1098-213750623835291/AnsiballZ_file.py'
Nov 22 09:53:36 compute-0 sudo[153689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:36 compute-0 python3.9[153691]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:36 compute-0 sudo[153689]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:37 compute-0 sudo[153841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okjpnhuwsdrsaygygesozzrhebbaofma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805216.6904671-1106-81751201707748/AnsiballZ_find.py'
Nov 22 09:53:37 compute-0 sudo[153841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:37 compute-0 python3.9[153843]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 09:53:37 compute-0 sudo[153841]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:38 compute-0 sudo[153993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igzwucxodozhnpmhfbiwwdpvzljwqpxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805217.9508529-1120-114710167739052/AnsiballZ_stat.py'
Nov 22 09:53:38 compute-0 sudo[153993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:38 compute-0 python3.9[153995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:38 compute-0 sudo[153993]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:38 compute-0 sudo[154116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfesjakpefwcscauawoznsblwjpcretc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805217.9508529-1120-114710167739052/AnsiballZ_copy.py'
Nov 22 09:53:38 compute-0 sudo[154116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:39 compute-0 python3.9[154118]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805217.9508529-1120-114710167739052/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:39 compute-0 sudo[154116]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:39 compute-0 sudo[154268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxtrchkdaqxestakvtvgozrnqvvybqhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805219.4561632-1136-238563077777773/AnsiballZ_file.py'
Nov 22 09:53:39 compute-0 sudo[154268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:39 compute-0 python3.9[154270]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:39 compute-0 sudo[154268]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:40 compute-0 sudo[154420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnqcypsqbhfjamjtlysioqrhtiedwbbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805220.1690373-1144-54356536703716/AnsiballZ_stat.py'
Nov 22 09:53:40 compute-0 sudo[154420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:40 compute-0 python3.9[154422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:40 compute-0 sudo[154420]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:41 compute-0 sudo[154498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfyyqprwynorwrenqkqqtkpfxoozdnlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805220.1690373-1144-54356536703716/AnsiballZ_file.py'
Nov 22 09:53:41 compute-0 sudo[154498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:41 compute-0 python3.9[154500]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:41 compute-0 sudo[154498]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:41 compute-0 sudo[154650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mylbevarqbzjmykqceikpsfdyghulvav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805221.5332682-1156-112946249725729/AnsiballZ_stat.py'
Nov 22 09:53:41 compute-0 sudo[154650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:42 compute-0 python3.9[154652]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:42 compute-0 sudo[154650]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:42 compute-0 sudo[154728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toerexclzcsqahuhyoxrdfbwbisqbgld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805221.5332682-1156-112946249725729/AnsiballZ_file.py'
Nov 22 09:53:42 compute-0 sudo[154728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:42 compute-0 python3.9[154730]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.iarpqc88 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:42 compute-0 sudo[154728]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:43 compute-0 sudo[154880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmbimcdaxmzuazcwosmbplmwmraillvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805222.9809153-1168-223924529506706/AnsiballZ_stat.py'
Nov 22 09:53:43 compute-0 sudo[154880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:43 compute-0 python3.9[154882]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:43 compute-0 sudo[154880]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:43 compute-0 sudo[154958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zufzkfxukgyjhmvykjpdgolvefzwdwux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805222.9809153-1168-223924529506706/AnsiballZ_file.py'
Nov 22 09:53:43 compute-0 sudo[154958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:44 compute-0 python3.9[154960]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:44 compute-0 sudo[154958]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:44 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 22 09:53:44 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 22 09:53:44 compute-0 sudo[155129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zueqcugdnicqhwlvckbhawcgqidekynh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805224.3000898-1181-240441461059243/AnsiballZ_command.py'
Nov 22 09:53:44 compute-0 sudo[155129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:44 compute-0 podman[155061]: 2025-11-22 09:53:44.642299091 +0000 UTC m=+0.093572960 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:53:44 compute-0 python3.9[155136]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:53:44 compute-0 sudo[155129]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:45 compute-0 sudo[155287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bifnegjwcdzhrqnbwqgpzdcgrruqlnax ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763805225.092884-1189-260815230301228/AnsiballZ_edpm_nftables_from_files.py'
Nov 22 09:53:45 compute-0 sudo[155287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:45 compute-0 python3[155289]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 09:53:45 compute-0 sudo[155287]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:46 compute-0 sudo[155439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyenurohkbcqmuaiepxxviajjwtjfnfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805226.0814276-1197-219018423100763/AnsiballZ_stat.py'
Nov 22 09:53:46 compute-0 sudo[155439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:46 compute-0 python3.9[155441]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:46 compute-0 sudo[155439]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:46 compute-0 sudo[155517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itfeaqsijdwflskkbhufaoysiwxhwwrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805226.0814276-1197-219018423100763/AnsiballZ_file.py'
Nov 22 09:53:46 compute-0 sudo[155517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:47 compute-0 python3.9[155519]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:47 compute-0 sudo[155517]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:47 compute-0 sudo[155669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvtwdsprlohflfaqmyzcaaodowyrywmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805227.3347633-1209-135534315273954/AnsiballZ_stat.py'
Nov 22 09:53:47 compute-0 sudo[155669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:47 compute-0 python3.9[155671]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:47 compute-0 sudo[155669]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:48 compute-0 sudo[155747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hytdezcrktpxamzspjeubiphbjsrtbgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805227.3347633-1209-135534315273954/AnsiballZ_file.py'
Nov 22 09:53:48 compute-0 sudo[155747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:48 compute-0 python3.9[155749]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:48 compute-0 sudo[155747]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:48 compute-0 sudo[155899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbtnvhbbmvgsniciemiopofyelcfaecr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805228.5418904-1221-195108803544054/AnsiballZ_stat.py'
Nov 22 09:53:48 compute-0 sudo[155899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:49 compute-0 python3.9[155901]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:49 compute-0 sudo[155899]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:49 compute-0 sudo[155977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvcwsudvbpcsncjikiqfeqjgkkudiuxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805228.5418904-1221-195108803544054/AnsiballZ_file.py'
Nov 22 09:53:49 compute-0 sudo[155977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:49 compute-0 python3.9[155979]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:49 compute-0 sudo[155977]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:49 compute-0 sudo[156145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkyxokcmoznavnbivijwynkupxqfrbeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805229.6981378-1233-48489854889182/AnsiballZ_stat.py'
Nov 22 09:53:50 compute-0 sudo[156145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:50 compute-0 podman[156103]: 2025-11-22 09:53:50.034385976 +0000 UTC m=+0.087902256 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 09:53:50 compute-0 python3.9[156151]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:50 compute-0 sudo[156145]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:50 compute-0 sudo[156227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyrnrcuyrtezilodvpefgvqyvptkxksn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805229.6981378-1233-48489854889182/AnsiballZ_file.py'
Nov 22 09:53:50 compute-0 sudo[156227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:50 compute-0 python3.9[156229]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:50 compute-0 sudo[156227]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:51 compute-0 sudo[156379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaqbktjqugclpxndadazcztlztvdugcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805230.7808268-1245-152723018753313/AnsiballZ_stat.py'
Nov 22 09:53:51 compute-0 sudo[156379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:51 compute-0 python3.9[156381]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:51 compute-0 sudo[156379]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:51 compute-0 sudo[156504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gldkxskowfgwdsxkxuuprxpniagtcjrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805230.7808268-1245-152723018753313/AnsiballZ_copy.py'
Nov 22 09:53:51 compute-0 sudo[156504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:52 compute-0 python3.9[156506]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805230.7808268-1245-152723018753313/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:52 compute-0 sudo[156504]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:52 compute-0 sudo[156656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qguvvaczhlvgmwdybkcgwkuxcsqsgajy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805232.2898622-1260-275180090321791/AnsiballZ_file.py'
Nov 22 09:53:52 compute-0 sudo[156656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:52 compute-0 python3.9[156658]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:52 compute-0 sudo[156656]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:53 compute-0 sudo[156808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqbboovvaarebqnnwinvirsgjhdzduju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805232.9544396-1268-247535968754792/AnsiballZ_command.py'
Nov 22 09:53:53 compute-0 sudo[156808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:53 compute-0 python3.9[156810]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:53:53 compute-0 sudo[156808]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:54 compute-0 sudo[156963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wycmruggrzjphhogtaxvafmcbyohfoih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805233.8114066-1276-8514995463538/AnsiballZ_blockinfile.py'
Nov 22 09:53:54 compute-0 sudo[156963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:54 compute-0 python3.9[156965]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:54 compute-0 sudo[156963]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:55 compute-0 sudo[157115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymlvdcovwnskvqdeykupuahdiiezykbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805234.7991416-1285-56268874400198/AnsiballZ_command.py'
Nov 22 09:53:55 compute-0 sudo[157115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:55 compute-0 python3.9[157117]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:53:55 compute-0 sudo[157115]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:55 compute-0 sudo[157268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlyfctcwpifiimoaboabzwocyjnmhkpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805235.5132208-1293-279950829755396/AnsiballZ_stat.py'
Nov 22 09:53:55 compute-0 sudo[157268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:56 compute-0 python3.9[157270]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:53:56 compute-0 sudo[157268]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:56 compute-0 sudo[157422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olkbpdlgufdrjvizjzkqdzykufqnrzir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805236.3588216-1301-39563022213400/AnsiballZ_command.py'
Nov 22 09:53:56 compute-0 sudo[157422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:56 compute-0 python3.9[157424]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:53:56 compute-0 sudo[157422]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:57 compute-0 sudo[157577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guepqqiknfxjwmgdrkzcdnexjijpgapo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805237.088018-1309-256187895781272/AnsiballZ_file.py'
Nov 22 09:53:57 compute-0 sudo[157577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:57 compute-0 python3.9[157579]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:57 compute-0 sudo[157577]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:58 compute-0 sudo[157729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbblbbxqvqzszeickdilkrfhqezsexaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805237.7904468-1317-233871921733248/AnsiballZ_stat.py'
Nov 22 09:53:58 compute-0 sudo[157729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:58 compute-0 python3.9[157731]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:58 compute-0 sudo[157729]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:58 compute-0 sudo[157852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynbebcnmeptyikklvpamdryefublwtow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805237.7904468-1317-233871921733248/AnsiballZ_copy.py'
Nov 22 09:53:58 compute-0 sudo[157852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:58 compute-0 python3.9[157854]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805237.7904468-1317-233871921733248/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:53:58 compute-0 sudo[157852]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:59 compute-0 sudo[158004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yikjijpnrewfqavtbwtgipvyovbskelb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805238.992756-1332-129791021564683/AnsiballZ_stat.py'
Nov 22 09:53:59 compute-0 sudo[158004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:53:59 compute-0 python3.9[158006]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:53:59 compute-0 sudo[158004]: pam_unix(sudo:session): session closed for user root
Nov 22 09:53:59 compute-0 sudo[158127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvwrornaqhaecpgoyimkiwjokjpzvvsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805238.992756-1332-129791021564683/AnsiballZ_copy.py'
Nov 22 09:53:59 compute-0 sudo[158127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:00 compute-0 python3.9[158129]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805238.992756-1332-129791021564683/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:00 compute-0 sudo[158127]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:00 compute-0 sudo[158279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjikalcacthheonjskqpylydnzcjkpma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805240.2599423-1347-2444832632891/AnsiballZ_stat.py'
Nov 22 09:54:00 compute-0 sudo[158279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:00 compute-0 python3.9[158281]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:54:00 compute-0 sudo[158279]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:01 compute-0 sudo[158402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkjvgotyxhztlwrdtguczzozppsuztse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805240.2599423-1347-2444832632891/AnsiballZ_copy.py'
Nov 22 09:54:01 compute-0 sudo[158402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:01 compute-0 python3.9[158404]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805240.2599423-1347-2444832632891/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:01 compute-0 sudo[158402]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:02 compute-0 sudo[158554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmwmfkcxqvzuypwfockapyiybtmyntmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805241.6983147-1362-117389209618267/AnsiballZ_systemd.py'
Nov 22 09:54:02 compute-0 sudo[158554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:02 compute-0 python3.9[158556]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:54:02 compute-0 systemd[1]: Reloading.
Nov 22 09:54:02 compute-0 systemd-rc-local-generator[158583]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:54:02 compute-0 systemd-sysv-generator[158587]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:54:02 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Nov 22 09:54:02 compute-0 sudo[158554]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:03 compute-0 sudo[158744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frlfpjxvtwdbtqlttbugxljhinvxbqck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805242.965728-1370-167167024915892/AnsiballZ_systemd.py'
Nov 22 09:54:03 compute-0 sudo[158744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:03 compute-0 python3.9[158746]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 22 09:54:03 compute-0 systemd[1]: Reloading.
Nov 22 09:54:03 compute-0 systemd-rc-local-generator[158774]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:54:03 compute-0 systemd-sysv-generator[158777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:54:03 compute-0 systemd[1]: Reloading.
Nov 22 09:54:04 compute-0 systemd-rc-local-generator[158813]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:54:04 compute-0 systemd-sysv-generator[158817]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:54:04 compute-0 sudo[158744]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:04 compute-0 sshd-session[104337]: Connection closed by 192.168.122.30 port 35646
Nov 22 09:54:04 compute-0 sshd-session[104334]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:54:04 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Nov 22 09:54:04 compute-0 systemd[1]: session-22.scope: Consumed 3min 38.480s CPU time.
Nov 22 09:54:04 compute-0 systemd-logind[819]: Session 22 logged out. Waiting for processes to exit.
Nov 22 09:54:04 compute-0 systemd-logind[819]: Removed session 22.
Nov 22 09:54:11 compute-0 sshd-session[158844]: Accepted publickey for zuul from 192.168.122.30 port 44782 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:54:11 compute-0 systemd-logind[819]: New session 23 of user zuul.
Nov 22 09:54:11 compute-0 systemd[1]: Started Session 23 of User zuul.
Nov 22 09:54:11 compute-0 sshd-session[158844]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:54:12 compute-0 python3.9[158997]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:54:13 compute-0 python3.9[159151]: ansible-ansible.builtin.service_facts Invoked
Nov 22 09:54:13 compute-0 network[159168]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 09:54:13 compute-0 network[159169]: 'network-scripts' will be removed from distribution in near future.
Nov 22 09:54:13 compute-0 network[159170]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 09:54:14 compute-0 podman[159186]: 2025-11-22 09:54:14.790042037 +0000 UTC m=+0.091633712 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 22 09:54:17 compute-0 sudo[159464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbartonimhvpozdwxzliryhlbzhutncr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805257.2014008-47-211474249778726/AnsiballZ_setup.py'
Nov 22 09:54:17 compute-0 sudo[159464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:17 compute-0 python3.9[159466]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 09:54:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:54:17.916 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:54:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:54:17.918 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:54:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:54:17.918 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:54:18 compute-0 sudo[159464]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:18 compute-0 sudo[159548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vshpglqijqbngwibgfknqwhmcvrptnen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805257.2014008-47-211474249778726/AnsiballZ_dnf.py'
Nov 22 09:54:18 compute-0 sudo[159548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:18 compute-0 python3.9[159550]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:54:20 compute-0 podman[159552]: 2025-11-22 09:54:20.611054621 +0000 UTC m=+0.066285227 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:54:23 compute-0 sudo[159548]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:24 compute-0 sudo[159721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjbjngcbfhnmkbqfestlodzakqbnaend ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805264.1674116-59-72916975873753/AnsiballZ_stat.py'
Nov 22 09:54:24 compute-0 sudo[159721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:24 compute-0 python3.9[159723]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:54:24 compute-0 sudo[159721]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:25 compute-0 sudo[159873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocwezouodufqtdxvscgrapfxdhgjxzhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805265.1646008-69-160349973921497/AnsiballZ_command.py'
Nov 22 09:54:25 compute-0 sudo[159873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:25 compute-0 python3.9[159875]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:54:25 compute-0 sudo[159873]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:26 compute-0 sudo[160026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mewsurlaphlvhwaguulrbprwdggtsnqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805266.2094576-79-127915659874346/AnsiballZ_stat.py'
Nov 22 09:54:26 compute-0 sudo[160026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:26 compute-0 python3.9[160028]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:54:26 compute-0 sudo[160026]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:27 compute-0 sudo[160178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjtekitaezpkkfbwyixoflgflfqexywu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805267.000928-87-172783766649420/AnsiballZ_command.py'
Nov 22 09:54:27 compute-0 sudo[160178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:27 compute-0 python3.9[160180]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:54:27 compute-0 sudo[160178]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:28 compute-0 sudo[160331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrvhqmpzgfyjtceoqvlxrsmcyscxgbkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805267.7408564-95-8169172693471/AnsiballZ_stat.py'
Nov 22 09:54:28 compute-0 sudo[160331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:28 compute-0 python3.9[160333]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:54:28 compute-0 sudo[160331]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:28 compute-0 sudo[160454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sovyydhylrmdotvzzstxbsbknffwokqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805267.7408564-95-8169172693471/AnsiballZ_copy.py'
Nov 22 09:54:28 compute-0 sudo[160454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:28 compute-0 python3.9[160456]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805267.7408564-95-8169172693471/.source.iscsi _original_basename=.inhgnjss follow=False checksum=8b51b77ed65157a617f891436237689227cd542b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:29 compute-0 sudo[160454]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:29 compute-0 sudo[160606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqrkttvbyfqkjihqxizcplrniqktpyjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805269.2221248-110-117689337671665/AnsiballZ_file.py'
Nov 22 09:54:29 compute-0 sudo[160606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:29 compute-0 python3.9[160608]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:29 compute-0 sudo[160606]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:30 compute-0 sudo[160758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyenpfuwedsgkpuuqklqzchkozwnzskm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805270.1054149-118-180231859245215/AnsiballZ_lineinfile.py'
Nov 22 09:54:30 compute-0 sudo[160758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:30 compute-0 python3.9[160760]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:30 compute-0 sudo[160758]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:30 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 09:54:30 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 09:54:31 compute-0 sudo[160911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyivfesgjnxzfaruqcakohhxiyrhrlgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805271.2013192-127-185520836330769/AnsiballZ_systemd_service.py'
Nov 22 09:54:31 compute-0 sudo[160911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:32 compute-0 python3.9[160913]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:54:32 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 22 09:54:32 compute-0 sudo[160911]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:32 compute-0 sudo[161067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sisadswdfuvwhmjxnkdyeeeqjilnvjtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805272.4259794-135-210186271367270/AnsiballZ_systemd_service.py'
Nov 22 09:54:32 compute-0 sudo[161067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:33 compute-0 python3.9[161069]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:54:33 compute-0 systemd[1]: Reloading.
Nov 22 09:54:33 compute-0 systemd-rc-local-generator[161099]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:54:33 compute-0 systemd-sysv-generator[161103]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:54:33 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 22 09:54:33 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 22 09:54:33 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Nov 22 09:54:33 compute-0 systemd[1]: Started Open-iSCSI.
Nov 22 09:54:33 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 22 09:54:33 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 22 09:54:33 compute-0 sudo[161067]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:34 compute-0 sudo[161268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tithlnzdkwcghwhmgpkyvcpeqlulddxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805273.9358056-146-54954845099015/AnsiballZ_service_facts.py'
Nov 22 09:54:34 compute-0 sudo[161268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:34 compute-0 python3.9[161270]: ansible-ansible.builtin.service_facts Invoked
Nov 22 09:54:34 compute-0 network[161287]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 09:54:34 compute-0 network[161288]: 'network-scripts' will be removed from distribution in near future.
Nov 22 09:54:34 compute-0 network[161289]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 09:54:39 compute-0 sudo[161268]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:40 compute-0 sudo[161558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shtgyusxdiqlhrzvzmdyxfnpovojtlxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805279.841241-156-55135450795917/AnsiballZ_file.py'
Nov 22 09:54:40 compute-0 sudo[161558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:40 compute-0 python3.9[161560]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 09:54:40 compute-0 sudo[161558]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:40 compute-0 sudo[161710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpnquhehatebqtthyxrnpzcgrkjdhpys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805280.5419772-164-109033198643591/AnsiballZ_modprobe.py'
Nov 22 09:54:40 compute-0 sudo[161710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:41 compute-0 python3.9[161712]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 22 09:54:41 compute-0 sudo[161710]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:41 compute-0 sudo[161866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oewateeerghgfsgpunejdwdkitnkvwga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805281.3999078-172-168101480964574/AnsiballZ_stat.py'
Nov 22 09:54:41 compute-0 sudo[161866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:42 compute-0 python3.9[161868]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:54:42 compute-0 sudo[161866]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:42 compute-0 sudo[161989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eontpcsshwgptjzbttxeqgmntdduxlic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805281.3999078-172-168101480964574/AnsiballZ_copy.py'
Nov 22 09:54:42 compute-0 sudo[161989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:42 compute-0 python3.9[161991]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805281.3999078-172-168101480964574/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:42 compute-0 sudo[161989]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:43 compute-0 sudo[162141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsyzskbkofkqpaunzdvjehohzzifwrps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805283.0011747-188-258959446588749/AnsiballZ_lineinfile.py'
Nov 22 09:54:43 compute-0 sudo[162141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:43 compute-0 python3.9[162143]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:43 compute-0 sudo[162141]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:44 compute-0 sudo[162293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mchlfffxfdulkuwkegayignhhblwgfml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805283.8337471-196-274172650339753/AnsiballZ_systemd.py'
Nov 22 09:54:44 compute-0 sudo[162293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:44 compute-0 python3.9[162295]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:54:44 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 22 09:54:44 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 22 09:54:44 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 22 09:54:44 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 22 09:54:44 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 22 09:54:44 compute-0 sudo[162293]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:44 compute-0 podman[162297]: 2025-11-22 09:54:44.979931242 +0000 UTC m=+0.130963129 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:54:45 compute-0 sudo[162473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dalxcbhvnxwfpsnsnitdqzmurtxallry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805285.1666012-204-223594648390320/AnsiballZ_file.py'
Nov 22 09:54:45 compute-0 sudo[162473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:45 compute-0 python3.9[162475]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:54:45 compute-0 sudo[162473]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:46 compute-0 sudo[162625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeanjwvftncnyusbmwnjkuynisdwivsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805286.0502355-213-116667472654637/AnsiballZ_stat.py'
Nov 22 09:54:46 compute-0 sudo[162625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:46 compute-0 python3.9[162627]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:54:46 compute-0 sudo[162625]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:47 compute-0 sudo[162777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knakafjefcaxvmynivbswnujvjbtbhsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805286.8173885-222-167202024066421/AnsiballZ_stat.py'
Nov 22 09:54:47 compute-0 sudo[162777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:47 compute-0 python3.9[162779]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:54:47 compute-0 sudo[162777]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:47 compute-0 sudo[162929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cllhyumcolkwtsuenhixteifckkezoif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805287.5182042-230-159852466052427/AnsiballZ_stat.py'
Nov 22 09:54:47 compute-0 sudo[162929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:47 compute-0 python3.9[162931]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:54:48 compute-0 sudo[162929]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:48 compute-0 sudo[163052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxnhoqtoxmrlscufqnwtreonfdozvgci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805287.5182042-230-159852466052427/AnsiballZ_copy.py'
Nov 22 09:54:48 compute-0 sudo[163052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:48 compute-0 python3.9[163054]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805287.5182042-230-159852466052427/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:48 compute-0 sudo[163052]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:49 compute-0 sudo[163204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpxrhczfohtlwsuhkuszgyatkpxzyrjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805288.7366648-245-59951163834385/AnsiballZ_command.py'
Nov 22 09:54:49 compute-0 sudo[163204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:49 compute-0 python3.9[163206]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:54:49 compute-0 sudo[163204]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:49 compute-0 sudo[163357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tekiqzycupxfpdtdxedekhhzdyphxzde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805289.510932-253-214430419652680/AnsiballZ_lineinfile.py'
Nov 22 09:54:49 compute-0 sudo[163357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:50 compute-0 python3.9[163359]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:50 compute-0 sudo[163357]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:50 compute-0 sudo[163509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjdcsudxwiiuitpiziwjayzizgsjlxka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805290.2326791-261-111215567135693/AnsiballZ_replace.py'
Nov 22 09:54:50 compute-0 sudo[163509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:50 compute-0 podman[163511]: 2025-11-22 09:54:50.726358189 +0000 UTC m=+0.045344849 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 09:54:50 compute-0 python3.9[163512]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:50 compute-0 sudo[163509]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:51 compute-0 sudo[163681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aywkezlltwnnjfqwufjhwxfeaugevjil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805291.008562-269-153101803583940/AnsiballZ_replace.py'
Nov 22 09:54:51 compute-0 sudo[163681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:51 compute-0 python3.9[163683]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:51 compute-0 sudo[163681]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:52 compute-0 sudo[163833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thfnkjhgxhlrceykpsumfnsjhakvsrjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805291.7715492-278-178376110538718/AnsiballZ_lineinfile.py'
Nov 22 09:54:52 compute-0 sudo[163833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:52 compute-0 python3.9[163835]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:52 compute-0 sudo[163833]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:52 compute-0 sudo[163985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwwcyzisbniapuqnvtmvxyxhxyrtphhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805292.4952536-278-96102163112979/AnsiballZ_lineinfile.py'
Nov 22 09:54:52 compute-0 sudo[163985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:53 compute-0 python3.9[163987]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:53 compute-0 sudo[163985]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:53 compute-0 sudo[164137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vucptslmffbtuzcitnarkqqhjmaouqze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805293.188721-278-247618782105908/AnsiballZ_lineinfile.py'
Nov 22 09:54:53 compute-0 sudo[164137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:53 compute-0 python3.9[164139]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:53 compute-0 sudo[164137]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:54 compute-0 sudo[164289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqhindmwwtlbrjzmqszdchnceccjskni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805294.0956945-278-70058293774920/AnsiballZ_lineinfile.py'
Nov 22 09:54:54 compute-0 sudo[164289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:54 compute-0 python3.9[164291]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:54 compute-0 sudo[164289]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:55 compute-0 sudo[164441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ortrjnflykuoqjvdmamsetvvsbrukshf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805294.7804184-307-235792451877713/AnsiballZ_stat.py'
Nov 22 09:54:55 compute-0 sudo[164441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:55 compute-0 python3.9[164443]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:54:55 compute-0 sudo[164441]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:55 compute-0 sudo[164595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaocvwquvpwqbecwgoynrxkivioqhasc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805295.5004086-315-41058370205274/AnsiballZ_file.py'
Nov 22 09:54:55 compute-0 sudo[164595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:55 compute-0 python3.9[164597]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:55 compute-0 sudo[164595]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:56 compute-0 sudo[164747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqprtthyswokhovyjtvndvlkolbecvbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805296.2009602-324-141949806185549/AnsiballZ_file.py'
Nov 22 09:54:56 compute-0 sudo[164747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:56 compute-0 python3.9[164749]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:54:56 compute-0 sudo[164747]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:57 compute-0 sudo[164899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtjosmixwvzngteoxskxfsnnehsjpktn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805296.933121-332-193749145247970/AnsiballZ_stat.py'
Nov 22 09:54:57 compute-0 sudo[164899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:57 compute-0 python3.9[164901]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:54:57 compute-0 sudo[164899]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:57 compute-0 sudo[164977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfkngzmaxoledifulzrafezhdveenxkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805296.933121-332-193749145247970/AnsiballZ_file.py'
Nov 22 09:54:57 compute-0 sudo[164977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:58 compute-0 python3.9[164979]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:54:58 compute-0 sudo[164977]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:58 compute-0 sudo[165129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixcyuqqwwtjbkmwyshojkjhzxrwzsito ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805298.3056297-332-268088746467269/AnsiballZ_stat.py'
Nov 22 09:54:58 compute-0 sudo[165129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:58 compute-0 python3.9[165131]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:54:58 compute-0 sudo[165129]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:59 compute-0 sudo[165207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkbjcbyesmwptvkecrhekinkohzidtnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805298.3056297-332-268088746467269/AnsiballZ_file.py'
Nov 22 09:54:59 compute-0 sudo[165207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:59 compute-0 python3.9[165209]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:54:59 compute-0 sudo[165207]: pam_unix(sudo:session): session closed for user root
Nov 22 09:54:59 compute-0 sudo[165359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uawzsotqpxjxrvcivyjnawaatubechlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805299.499465-355-11575598085971/AnsiballZ_file.py'
Nov 22 09:54:59 compute-0 sudo[165359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:54:59 compute-0 python3.9[165361]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:54:59 compute-0 sudo[165359]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:00 compute-0 sudo[165511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqjplnrorqcifjsyfiojbkdxrbmkofos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805300.1410372-363-7765074051536/AnsiballZ_stat.py'
Nov 22 09:55:00 compute-0 sudo[165511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:00 compute-0 python3.9[165513]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:55:00 compute-0 sudo[165511]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:00 compute-0 sudo[165589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkfgbmtmblxuxctetpscdmyuasgllrxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805300.1410372-363-7765074051536/AnsiballZ_file.py'
Nov 22 09:55:00 compute-0 sudo[165589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:01 compute-0 python3.9[165591]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:01 compute-0 sudo[165589]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:01 compute-0 sudo[165741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrvccetyhjfvwmbnrkplxjhlbuorsnuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805301.2326097-375-230418968590142/AnsiballZ_stat.py'
Nov 22 09:55:01 compute-0 sudo[165741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:01 compute-0 python3.9[165743]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:55:01 compute-0 sudo[165741]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:02 compute-0 sudo[165819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmtpyrubzdletzfldfceulmbvsxaocrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805301.2326097-375-230418968590142/AnsiballZ_file.py'
Nov 22 09:55:02 compute-0 sudo[165819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:02 compute-0 python3.9[165821]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:02 compute-0 sudo[165819]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:02 compute-0 sudo[165971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkmodgzmgurbmzaecwmivlrluzayjuvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805302.5574706-387-223624758764961/AnsiballZ_systemd.py'
Nov 22 09:55:02 compute-0 sudo[165971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:03 compute-0 python3.9[165973]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:55:03 compute-0 systemd[1]: Reloading.
Nov 22 09:55:03 compute-0 systemd-rc-local-generator[165993]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:55:03 compute-0 systemd-sysv-generator[166001]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:55:03 compute-0 sudo[165971]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:04 compute-0 sudo[166160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzagoifhyazygfgnfnsqzrwuqspqpsvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805303.7983289-395-206537585195514/AnsiballZ_stat.py'
Nov 22 09:55:04 compute-0 sudo[166160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:04 compute-0 python3.9[166162]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:55:04 compute-0 sudo[166160]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:04 compute-0 sudo[166238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deejpdcubmhhlvepfupnogfqqdohhaof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805303.7983289-395-206537585195514/AnsiballZ_file.py'
Nov 22 09:55:04 compute-0 sudo[166238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:04 compute-0 python3.9[166240]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:04 compute-0 sudo[166238]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:05 compute-0 sudo[166390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxsuexpdktrdoztikaepulocthosclwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805305.00161-407-13344232647709/AnsiballZ_stat.py'
Nov 22 09:55:05 compute-0 sudo[166390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:05 compute-0 python3.9[166392]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:55:05 compute-0 sudo[166390]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:05 compute-0 sudo[166468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnifnvpiagmszflejxxxkkogjdwtdtdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805305.00161-407-13344232647709/AnsiballZ_file.py'
Nov 22 09:55:05 compute-0 sudo[166468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:06 compute-0 python3.9[166470]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:06 compute-0 sudo[166468]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:06 compute-0 sudo[166620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gquusjkyuiksnnmejupdoxgmnyxagfdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805306.2344487-419-8803305101094/AnsiballZ_systemd.py'
Nov 22 09:55:06 compute-0 sudo[166620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:06 compute-0 python3.9[166622]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:55:06 compute-0 systemd[1]: Reloading.
Nov 22 09:55:06 compute-0 systemd-rc-local-generator[166647]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:55:06 compute-0 systemd-sysv-generator[166652]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:55:07 compute-0 systemd[1]: Starting Create netns directory...
Nov 22 09:55:07 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 09:55:07 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 09:55:07 compute-0 systemd[1]: Finished Create netns directory.
Nov 22 09:55:07 compute-0 sudo[166620]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:07 compute-0 sudo[166814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gszezyqjjrhiqtwwssugoxvsbdbujaqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805307.5525897-429-40017588960119/AnsiballZ_file.py'
Nov 22 09:55:07 compute-0 sudo[166814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:08 compute-0 python3.9[166816]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:55:08 compute-0 sudo[166814]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:08 compute-0 sudo[166966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imgaecfmilaxcbvbxmytpvbloffuxblj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805308.2907863-437-36107044093548/AnsiballZ_stat.py'
Nov 22 09:55:08 compute-0 sudo[166966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:08 compute-0 python3.9[166968]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:55:08 compute-0 sudo[166966]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:09 compute-0 sudo[167089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvpbzdlmkccdquwwifkfmkzqowapxpol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805308.2907863-437-36107044093548/AnsiballZ_copy.py'
Nov 22 09:55:09 compute-0 sudo[167089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:09 compute-0 python3.9[167091]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805308.2907863-437-36107044093548/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:55:09 compute-0 sudo[167089]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:10 compute-0 sudo[167241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxbdsrtsmptjsqxpocgjehxjbzjuwlew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805309.9429035-454-7491122145540/AnsiballZ_file.py'
Nov 22 09:55:10 compute-0 sudo[167241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:10 compute-0 python3.9[167243]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:55:10 compute-0 sudo[167241]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:10 compute-0 sudo[167393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxlzncomjjlynkpmbpkahgakmsecdzzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805310.6418476-462-272671476830317/AnsiballZ_stat.py'
Nov 22 09:55:10 compute-0 sudo[167393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:11 compute-0 python3.9[167395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:55:11 compute-0 sudo[167393]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:11 compute-0 sudo[167516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hksgzjvrvpddbyrodbyeslxbdxxluafz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805310.6418476-462-272671476830317/AnsiballZ_copy.py'
Nov 22 09:55:11 compute-0 sudo[167516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:11 compute-0 python3.9[167518]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805310.6418476-462-272671476830317/.source.json _original_basename=.418f32cu follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:11 compute-0 sudo[167516]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:12 compute-0 sudo[167668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gabzgxmrwronvfjvopxoqfyhvgxknhks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805312.1069803-477-41372485209282/AnsiballZ_file.py'
Nov 22 09:55:12 compute-0 sudo[167668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:12 compute-0 python3.9[167670]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:12 compute-0 sudo[167668]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:13 compute-0 sudo[167820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsvscfhefzdwlmhcfvujatkbcpnwhrxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805312.7947066-485-198689405215666/AnsiballZ_stat.py'
Nov 22 09:55:13 compute-0 sudo[167820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:13 compute-0 sudo[167820]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:13 compute-0 sudo[167943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wejbeqddhqizlqltnpmphzilxophhptq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805312.7947066-485-198689405215666/AnsiballZ_copy.py'
Nov 22 09:55:13 compute-0 sudo[167943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:13 compute-0 sudo[167943]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:14 compute-0 sudo[168095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcziqnhwdkbqtvxpbyvdtfinqgeehytc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805314.0708406-502-90790359006490/AnsiballZ_container_config_data.py'
Nov 22 09:55:14 compute-0 sudo[168095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:14 compute-0 python3.9[168097]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 22 09:55:14 compute-0 sudo[168095]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:15 compute-0 sudo[168257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dibolgwonyhftgxcvsbhrmhjjygvqhcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805314.905136-511-140872322125689/AnsiballZ_container_config_hash.py'
Nov 22 09:55:15 compute-0 sudo[168257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:15 compute-0 podman[168221]: 2025-11-22 09:55:15.360604355 +0000 UTC m=+0.081609182 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:55:15 compute-0 python3.9[168266]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 09:55:15 compute-0 sudo[168257]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:16 compute-0 sudo[168423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blgqudrvfgoaiyblxxqcwmmsxnhfeopx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805315.8074965-520-13614488963659/AnsiballZ_podman_container_info.py'
Nov 22 09:55:16 compute-0 sudo[168423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:16 compute-0 python3.9[168425]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 09:55:16 compute-0 sudo[168423]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:17 compute-0 sudo[168602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csrrcepclndxhvlsramqtnndvvnruwmf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763805317.072132-533-95568870132037/AnsiballZ_edpm_container_manage.py'
Nov 22 09:55:17 compute-0 sudo[168602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:17 compute-0 python3[168604]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 09:55:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:55:17.917 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:55:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:55:17.918 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:55:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:55:17.918 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:55:18 compute-0 podman[168640]: 2025-11-22 09:55:18.042897348 +0000 UTC m=+0.089843675 container create a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 09:55:18 compute-0 podman[168640]: 2025-11-22 09:55:17.9721581 +0000 UTC m=+0.019104417 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 09:55:18 compute-0 python3[168604]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 09:55:18 compute-0 sudo[168602]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:18 compute-0 sudo[168828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwhskvsdsxzlwfhdjkeeobragexyahpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805318.3313184-541-152866821559701/AnsiballZ_stat.py'
Nov 22 09:55:18 compute-0 sudo[168828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:18 compute-0 python3.9[168830]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:55:18 compute-0 sudo[168828]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:19 compute-0 sudo[168982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpwqurejrowbzdqymwwoelnwaunfxkej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805319.042035-550-248596639886462/AnsiballZ_file.py'
Nov 22 09:55:19 compute-0 sudo[168982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:19 compute-0 python3.9[168984]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:19 compute-0 sudo[168982]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:19 compute-0 sudo[169058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbjjeaxnpxacshmrhgexhssolzgbcwox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805319.042035-550-248596639886462/AnsiballZ_stat.py'
Nov 22 09:55:19 compute-0 sudo[169058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:19 compute-0 python3.9[169060]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:55:20 compute-0 sudo[169058]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:20 compute-0 sudo[169209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggfcqyjztsdkuihzfhiervpbcvibklae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805320.0777905-550-177034749461141/AnsiballZ_copy.py'
Nov 22 09:55:20 compute-0 sudo[169209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:20 compute-0 python3.9[169211]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763805320.0777905-550-177034749461141/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:20 compute-0 sudo[169209]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:21 compute-0 sudo[169296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vutxnrszcsjwjubjeywfjvygbwrgubzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805320.0777905-550-177034749461141/AnsiballZ_systemd.py'
Nov 22 09:55:21 compute-0 sudo[169296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:21 compute-0 podman[169259]: 2025-11-22 09:55:21.121752958 +0000 UTC m=+0.082364902 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:55:21 compute-0 python3.9[169302]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:55:21 compute-0 systemd[1]: Reloading.
Nov 22 09:55:21 compute-0 systemd-rc-local-generator[169332]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:55:21 compute-0 systemd-sysv-generator[169335]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:55:21 compute-0 sudo[169296]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:22 compute-0 sudo[169414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhsbknvymitrwaciltzrdgjfspmlveps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805320.0777905-550-177034749461141/AnsiballZ_systemd.py'
Nov 22 09:55:22 compute-0 sudo[169414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:22 compute-0 python3.9[169416]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:55:22 compute-0 systemd[1]: Reloading.
Nov 22 09:55:22 compute-0 systemd-rc-local-generator[169441]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:55:22 compute-0 systemd-sysv-generator[169446]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:55:22 compute-0 systemd[1]: Starting multipathd container...
Nov 22 09:55:22 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:55:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/831a668ce47cb9af29d4f3c2b2623101934f5c7e323844543b68d7477d329a6c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 09:55:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/831a668ce47cb9af29d4f3c2b2623101934f5c7e323844543b68d7477d329a6c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 09:55:23 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323.
Nov 22 09:55:23 compute-0 podman[169455]: 2025-11-22 09:55:23.191773125 +0000 UTC m=+0.346442955 container init a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 09:55:23 compute-0 multipathd[169471]: + sudo -E kolla_set_configs
Nov 22 09:55:23 compute-0 podman[169455]: 2025-11-22 09:55:23.223928593 +0000 UTC m=+0.378598363 container start a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 09:55:23 compute-0 sudo[169477]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 22 09:55:23 compute-0 sudo[169477]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 09:55:23 compute-0 sudo[169477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 22 09:55:23 compute-0 podman[169455]: multipathd
Nov 22 09:55:23 compute-0 systemd[1]: Started multipathd container.
Nov 22 09:55:23 compute-0 sudo[169414]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:23 compute-0 podman[169478]: 2025-11-22 09:55:23.299901302 +0000 UTC m=+0.063587857 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 09:55:23 compute-0 systemd[1]: a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323-7401c3220f91d52c.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 09:55:23 compute-0 systemd[1]: a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323-7401c3220f91d52c.service: Failed with result 'exit-code'.
Nov 22 09:55:23 compute-0 multipathd[169471]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 09:55:23 compute-0 multipathd[169471]: INFO:__main__:Validating config file
Nov 22 09:55:23 compute-0 multipathd[169471]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 09:55:23 compute-0 multipathd[169471]: INFO:__main__:Writing out command to execute
Nov 22 09:55:23 compute-0 sudo[169477]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:23 compute-0 multipathd[169471]: ++ cat /run_command
Nov 22 09:55:23 compute-0 multipathd[169471]: + CMD='/usr/sbin/multipathd -d'
Nov 22 09:55:23 compute-0 multipathd[169471]: + ARGS=
Nov 22 09:55:23 compute-0 multipathd[169471]: + sudo kolla_copy_cacerts
Nov 22 09:55:23 compute-0 sudo[169509]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 22 09:55:23 compute-0 sudo[169509]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 09:55:23 compute-0 sudo[169509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 22 09:55:23 compute-0 sudo[169509]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:23 compute-0 multipathd[169471]: + [[ ! -n '' ]]
Nov 22 09:55:23 compute-0 multipathd[169471]: + . kolla_extend_start
Nov 22 09:55:23 compute-0 multipathd[169471]: Running command: '/usr/sbin/multipathd -d'
Nov 22 09:55:23 compute-0 multipathd[169471]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 22 09:55:23 compute-0 multipathd[169471]: + umask 0022
Nov 22 09:55:23 compute-0 multipathd[169471]: + exec /usr/sbin/multipathd -d
Nov 22 09:55:23 compute-0 multipathd[169471]: 2781.128601 | --------start up--------
Nov 22 09:55:23 compute-0 multipathd[169471]: 2781.128622 | read /etc/multipath.conf
Nov 22 09:55:23 compute-0 multipathd[169471]: 2781.134049 | path checkers start up
Nov 22 09:55:23 compute-0 python3.9[169661]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:55:24 compute-0 sudo[169813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pawtlhcoacyxnakatqrdifhirjhavsop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805324.2256835-586-72940427697747/AnsiballZ_command.py'
Nov 22 09:55:24 compute-0 sudo[169813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:24 compute-0 python3.9[169815]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:55:24 compute-0 sudo[169813]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:25 compute-0 sudo[169979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dotzsyudwgsufvlhndlibomghyxlatrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805324.9475274-594-142923640082860/AnsiballZ_systemd.py'
Nov 22 09:55:25 compute-0 sudo[169979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:25 compute-0 python3.9[169981]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:55:25 compute-0 systemd[1]: Stopping multipathd container...
Nov 22 09:55:25 compute-0 multipathd[169471]: 2783.394174 | exit (signal)
Nov 22 09:55:25 compute-0 multipathd[169471]: 2783.394252 | --------shut down-------
Nov 22 09:55:25 compute-0 systemd[1]: libpod-a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323.scope: Deactivated successfully.
Nov 22 09:55:25 compute-0 podman[169985]: 2025-11-22 09:55:25.655717938 +0000 UTC m=+0.075244890 container died a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 09:55:25 compute-0 systemd[1]: a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323-7401c3220f91d52c.timer: Deactivated successfully.
Nov 22 09:55:25 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323.
Nov 22 09:55:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323-userdata-shm.mount: Deactivated successfully.
Nov 22 09:55:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-831a668ce47cb9af29d4f3c2b2623101934f5c7e323844543b68d7477d329a6c-merged.mount: Deactivated successfully.
Nov 22 09:55:25 compute-0 podman[169985]: 2025-11-22 09:55:25.704116104 +0000 UTC m=+0.123643046 container cleanup a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 09:55:25 compute-0 podman[169985]: multipathd
Nov 22 09:55:25 compute-0 podman[170015]: multipathd
Nov 22 09:55:25 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 22 09:55:25 compute-0 systemd[1]: Stopped multipathd container.
Nov 22 09:55:25 compute-0 systemd[1]: Starting multipathd container...
Nov 22 09:55:25 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:55:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/831a668ce47cb9af29d4f3c2b2623101934f5c7e323844543b68d7477d329a6c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 09:55:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/831a668ce47cb9af29d4f3c2b2623101934f5c7e323844543b68d7477d329a6c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 09:55:25 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323.
Nov 22 09:55:25 compute-0 podman[170028]: 2025-11-22 09:55:25.900790429 +0000 UTC m=+0.102576677 container init a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 22 09:55:25 compute-0 multipathd[170043]: + sudo -E kolla_set_configs
Nov 22 09:55:25 compute-0 podman[170028]: 2025-11-22 09:55:25.923013309 +0000 UTC m=+0.124799537 container start a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 09:55:25 compute-0 sudo[170049]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 22 09:55:25 compute-0 sudo[170049]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 09:55:25 compute-0 sudo[170049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 22 09:55:25 compute-0 podman[170028]: multipathd
Nov 22 09:55:25 compute-0 systemd[1]: Started multipathd container.
Nov 22 09:55:25 compute-0 sudo[169979]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:25 compute-0 multipathd[170043]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 09:55:25 compute-0 multipathd[170043]: INFO:__main__:Validating config file
Nov 22 09:55:25 compute-0 multipathd[170043]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 09:55:25 compute-0 multipathd[170043]: INFO:__main__:Writing out command to execute
Nov 22 09:55:25 compute-0 sudo[170049]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:25 compute-0 multipathd[170043]: ++ cat /run_command
Nov 22 09:55:25 compute-0 multipathd[170043]: + CMD='/usr/sbin/multipathd -d'
Nov 22 09:55:25 compute-0 multipathd[170043]: + ARGS=
Nov 22 09:55:25 compute-0 multipathd[170043]: + sudo kolla_copy_cacerts
Nov 22 09:55:25 compute-0 sudo[170069]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 22 09:55:25 compute-0 podman[170050]: 2025-11-22 09:55:25.986016979 +0000 UTC m=+0.051875031 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 09:55:25 compute-0 sudo[170069]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 09:55:25 compute-0 sudo[170069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 22 09:55:25 compute-0 sudo[170069]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:25 compute-0 systemd[1]: a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323-334026e42951a211.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 09:55:25 compute-0 multipathd[170043]: + [[ ! -n '' ]]
Nov 22 09:55:25 compute-0 multipathd[170043]: + . kolla_extend_start
Nov 22 09:55:25 compute-0 multipathd[170043]: Running command: '/usr/sbin/multipathd -d'
Nov 22 09:55:25 compute-0 multipathd[170043]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 22 09:55:25 compute-0 multipathd[170043]: + umask 0022
Nov 22 09:55:25 compute-0 multipathd[170043]: + exec /usr/sbin/multipathd -d
Nov 22 09:55:25 compute-0 systemd[1]: a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323-334026e42951a211.service: Failed with result 'exit-code'.
Nov 22 09:55:26 compute-0 multipathd[170043]: 2783.774543 | --------start up--------
Nov 22 09:55:26 compute-0 multipathd[170043]: 2783.774561 | read /etc/multipath.conf
Nov 22 09:55:26 compute-0 multipathd[170043]: 2783.780217 | path checkers start up
Nov 22 09:55:26 compute-0 sudo[170231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeswqtseehiqwadsnjhrtybllfmhsgkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805326.2682645-602-233660190888223/AnsiballZ_file.py'
Nov 22 09:55:26 compute-0 sudo[170231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:26 compute-0 python3.9[170233]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:26 compute-0 sudo[170231]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:27 compute-0 sudo[170383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfdmxrnvqekglmbocrgcrattqlddrfqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805327.1848848-614-35474097037968/AnsiballZ_file.py'
Nov 22 09:55:27 compute-0 sudo[170383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:27 compute-0 python3.9[170385]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 09:55:27 compute-0 sudo[170383]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:28 compute-0 sudo[170535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhgazcrwwdvhmqbwfolideojapjbgukd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805327.8726392-622-252590432996970/AnsiballZ_modprobe.py'
Nov 22 09:55:28 compute-0 sudo[170535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:28 compute-0 python3.9[170537]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 22 09:55:28 compute-0 kernel: Key type psk registered
Nov 22 09:55:28 compute-0 sudo[170535]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:28 compute-0 sudo[170698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crsluewhbedqgfgloormfgmfhscthlcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805328.582274-630-171914957131173/AnsiballZ_stat.py'
Nov 22 09:55:28 compute-0 sudo[170698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:29 compute-0 python3.9[170700]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:55:29 compute-0 sudo[170698]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:29 compute-0 sudo[170821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frumfpcutwaqngfhzullxuacdxxhwgnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805328.582274-630-171914957131173/AnsiballZ_copy.py'
Nov 22 09:55:29 compute-0 sudo[170821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:29 compute-0 python3.9[170823]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805328.582274-630-171914957131173/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:29 compute-0 sudo[170821]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:30 compute-0 sudo[170973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stpzelueapkbehxuycrzfiuqautaamiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805329.9479022-646-263617952366183/AnsiballZ_lineinfile.py'
Nov 22 09:55:30 compute-0 sudo[170973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:30 compute-0 python3.9[170975]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:30 compute-0 sudo[170973]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:30 compute-0 sudo[171125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syrdodxyijspghvstxdqbprkweczasrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805330.6083755-654-49976659946959/AnsiballZ_systemd.py'
Nov 22 09:55:30 compute-0 sudo[171125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:31 compute-0 python3.9[171127]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:55:31 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 22 09:55:31 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 22 09:55:31 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 22 09:55:31 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 22 09:55:31 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 22 09:55:31 compute-0 sudo[171125]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:31 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 22 09:55:31 compute-0 sudo[171282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvzuhuiiamnjiaqowcudttynvzamfijf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805331.5158026-662-244549517493022/AnsiballZ_dnf.py'
Nov 22 09:55:31 compute-0 sudo[171282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:32 compute-0 python3.9[171284]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 09:55:32 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 22 09:55:34 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 22 09:55:34 compute-0 systemd[1]: Reloading.
Nov 22 09:55:34 compute-0 systemd-rc-local-generator[171316]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:55:34 compute-0 systemd-sysv-generator[171319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:55:34 compute-0 systemd[1]: Reloading.
Nov 22 09:55:34 compute-0 systemd-rc-local-generator[171354]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:55:34 compute-0 systemd-sysv-generator[171358]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:55:34 compute-0 systemd-logind[819]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 22 09:55:35 compute-0 systemd-logind[819]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 22 09:55:35 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 09:55:35 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 09:55:35 compute-0 systemd[1]: Reloading.
Nov 22 09:55:35 compute-0 systemd-rc-local-generator[171451]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:55:35 compute-0 systemd-sysv-generator[171455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:55:35 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 22 09:55:35 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 09:55:36 compute-0 sudo[171282]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:36 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 09:55:36 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 09:55:36 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.437s CPU time.
Nov 22 09:55:36 compute-0 systemd[1]: run-r0809c04715cc4701be18d4dfe1af789d.service: Deactivated successfully.
Nov 22 09:55:36 compute-0 sudo[172740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obljqqnvmagweenjghlcwbvzqvaqifks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805336.5482059-670-23490548719640/AnsiballZ_systemd_service.py'
Nov 22 09:55:36 compute-0 sudo[172740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:37 compute-0 python3.9[172742]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:55:37 compute-0 systemd[1]: Stopping Open-iSCSI...
Nov 22 09:55:37 compute-0 iscsid[161110]: iscsid shutting down.
Nov 22 09:55:37 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Nov 22 09:55:37 compute-0 systemd[1]: Stopped Open-iSCSI.
Nov 22 09:55:37 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 22 09:55:37 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 22 09:55:37 compute-0 systemd[1]: Started Open-iSCSI.
Nov 22 09:55:37 compute-0 sudo[172740]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:38 compute-0 python3.9[172896]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:55:39 compute-0 sudo[173050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rugamduncvmaisnlzxtlccenlwhlioeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805338.8773227-688-89168819829811/AnsiballZ_file.py'
Nov 22 09:55:39 compute-0 sudo[173050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:39 compute-0 python3.9[173052]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:39 compute-0 sudo[173050]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:40 compute-0 sudo[173202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbgbzdjlhxbdvisnycofalpuujbynmer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805339.839147-699-92413951506203/AnsiballZ_systemd_service.py'
Nov 22 09:55:40 compute-0 sudo[173202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:40 compute-0 python3.9[173204]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:55:40 compute-0 systemd[1]: Reloading.
Nov 22 09:55:40 compute-0 systemd-rc-local-generator[173233]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:55:40 compute-0 systemd-sysv-generator[173238]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:55:40 compute-0 sudo[173202]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:41 compute-0 python3.9[173389]: ansible-ansible.builtin.service_facts Invoked
Nov 22 09:55:41 compute-0 network[173406]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 09:55:41 compute-0 network[173407]: 'network-scripts' will be removed from distribution in near future.
Nov 22 09:55:41 compute-0 network[173408]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 09:55:45 compute-0 podman[173494]: 2025-11-22 09:55:45.561513802 +0000 UTC m=+0.134489919 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 09:55:47 compute-0 sudo[173704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olmnhtobdtvfdomasszyjksjnjhaqwvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805347.6167836-718-256110147023531/AnsiballZ_systemd_service.py'
Nov 22 09:55:47 compute-0 sudo[173704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:48 compute-0 python3.9[173706]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:55:48 compute-0 sudo[173704]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:48 compute-0 sudo[173857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldxellhunsiqschplhnmalnoifbawcuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805348.402948-718-14785292935894/AnsiballZ_systemd_service.py'
Nov 22 09:55:48 compute-0 sudo[173857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:49 compute-0 python3.9[173859]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:55:49 compute-0 sudo[173857]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:49 compute-0 sudo[174010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msufrcylityvixyztqxaeooudobaseok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805349.1754081-718-249237037133989/AnsiballZ_systemd_service.py'
Nov 22 09:55:49 compute-0 sudo[174010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:49 compute-0 python3.9[174012]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:55:50 compute-0 sudo[174010]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:51 compute-0 sudo[174173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-perjddfwycpcwkthpkvxuvrinycmmdug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805351.0336807-718-186575735990704/AnsiballZ_systemd_service.py'
Nov 22 09:55:51 compute-0 sudo[174173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:51 compute-0 podman[174137]: 2025-11-22 09:55:51.445403136 +0000 UTC m=+0.076293659 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 09:55:51 compute-0 python3.9[174181]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:55:52 compute-0 sudo[174173]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:53 compute-0 sudo[174334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygkzjihinkrcmdsxlqrehvznapzfnzwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805352.9126372-718-45565629772511/AnsiballZ_systemd_service.py'
Nov 22 09:55:53 compute-0 sudo[174334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:53 compute-0 python3.9[174336]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:55:53 compute-0 sudo[174334]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:53 compute-0 sudo[174487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-robiesovdbedhaaqhfitueiudeghotyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805353.6643665-718-35338294632086/AnsiballZ_systemd_service.py'
Nov 22 09:55:53 compute-0 sudo[174487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:54 compute-0 python3.9[174489]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:55:54 compute-0 sudo[174487]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:54 compute-0 sudo[174640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kicsslakdvwxdzjjdrgfecihicltmqkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805354.6902287-718-70053936660475/AnsiballZ_systemd_service.py'
Nov 22 09:55:55 compute-0 sudo[174640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:55 compute-0 python3.9[174642]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:55:55 compute-0 sudo[174640]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:55 compute-0 sudo[174793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwwhgclyrlzkhtvaoulhbtcmpznavjxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805355.4539363-718-35946471878951/AnsiballZ_systemd_service.py'
Nov 22 09:55:55 compute-0 sudo[174793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:56 compute-0 python3.9[174795]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:55:56 compute-0 sudo[174793]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:56 compute-0 podman[174797]: 2025-11-22 09:55:56.092418846 +0000 UTC m=+0.050209776 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:55:56 compute-0 sudo[174967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwjlajqxixrtmxhnnjckmrvrtcqetdei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805356.3367758-777-238497735510631/AnsiballZ_file.py'
Nov 22 09:55:56 compute-0 sudo[174967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:56 compute-0 python3.9[174969]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:56 compute-0 sudo[174967]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:57 compute-0 sudo[175119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxuezjapjeyzbjpoweqtodiyracralbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805356.9922025-777-11001196369129/AnsiballZ_file.py'
Nov 22 09:55:57 compute-0 sudo[175119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:57 compute-0 python3.9[175121]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:57 compute-0 sudo[175119]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:57 compute-0 sudo[175271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmixsmtaymjvrndiogwyvhhbzfuzvndl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805357.6498516-777-164800588308004/AnsiballZ_file.py'
Nov 22 09:55:57 compute-0 sudo[175271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:58 compute-0 python3.9[175273]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:58 compute-0 sudo[175271]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:58 compute-0 sudo[175423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azyfkslboxqccfohkwwheniepycvrzak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805358.2957997-777-18854385494864/AnsiballZ_file.py'
Nov 22 09:55:58 compute-0 sudo[175423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:58 compute-0 python3.9[175425]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:58 compute-0 sudo[175423]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:59 compute-0 sudo[175575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojgjfrbwvdijfwobvgbxdewwunazknky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805358.9549806-777-8895820759358/AnsiballZ_file.py'
Nov 22 09:55:59 compute-0 sudo[175575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:55:59 compute-0 python3.9[175577]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:55:59 compute-0 sudo[175575]: pam_unix(sudo:session): session closed for user root
Nov 22 09:55:59 compute-0 sudo[175727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxhyzfhsehzcmepttphuvgophvnxhlss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805359.6595285-777-77062182164440/AnsiballZ_file.py'
Nov 22 09:55:59 compute-0 sudo[175727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:00 compute-0 python3.9[175729]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:00 compute-0 sudo[175727]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:00 compute-0 sudo[175879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbwcamdbgqkubyqgmsqlwcuflsdpqlsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805360.4075482-777-168404120405769/AnsiballZ_file.py'
Nov 22 09:56:00 compute-0 sudo[175879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:00 compute-0 python3.9[175881]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:00 compute-0 sudo[175879]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:01 compute-0 sudo[176031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mubrpqwunbbrrmxygqtenooshsfzopcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805361.0708816-777-34565750094726/AnsiballZ_file.py'
Nov 22 09:56:01 compute-0 sudo[176031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:01 compute-0 python3.9[176033]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:01 compute-0 sudo[176031]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:02 compute-0 sudo[176183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avwcfyfwmpbmnbnhqyjqnqzqhwdykywi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805361.7678945-834-172766684049984/AnsiballZ_file.py'
Nov 22 09:56:02 compute-0 sudo[176183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:02 compute-0 python3.9[176185]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:02 compute-0 sudo[176183]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:02 compute-0 sudo[176335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfidkdhkaydabhlmvemrkpnwncuqlpze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805362.4864748-834-159632165357149/AnsiballZ_file.py'
Nov 22 09:56:02 compute-0 sudo[176335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:02 compute-0 python3.9[176337]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:03 compute-0 sudo[176335]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:03 compute-0 sudo[176487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwgtpizigdhreemqbcnjbyurvwpaflqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805363.1569743-834-72584330443465/AnsiballZ_file.py'
Nov 22 09:56:03 compute-0 sudo[176487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:03 compute-0 python3.9[176489]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:03 compute-0 sudo[176487]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:04 compute-0 sudo[176639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egdtojefptznasqtbypuruqzjmqakzan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805363.9026244-834-126187068469140/AnsiballZ_file.py'
Nov 22 09:56:04 compute-0 sudo[176639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:04 compute-0 python3.9[176641]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:04 compute-0 sudo[176639]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:05 compute-0 sudo[176791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zheewnmciathofhkpenmdiirulsrdkwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805364.8096275-834-2540975301506/AnsiballZ_file.py'
Nov 22 09:56:05 compute-0 sudo[176791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:05 compute-0 python3.9[176793]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:05 compute-0 sudo[176791]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:05 compute-0 sudo[176943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okuzotnlejmwwpyvqfkmwixuctvblthp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805365.4040034-834-62290539961500/AnsiballZ_file.py'
Nov 22 09:56:05 compute-0 sudo[176943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:06 compute-0 python3.9[176945]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:06 compute-0 sudo[176943]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:06 compute-0 sudo[177095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqdeqndcucucvjhtesvmkanssyikauhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805366.2996116-834-214794903169755/AnsiballZ_file.py'
Nov 22 09:56:06 compute-0 sudo[177095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:06 compute-0 python3.9[177097]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:06 compute-0 sudo[177095]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:07 compute-0 sudo[177247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxantgfislnthlmctuezoolhxbomeltr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805366.9135997-834-167515666588228/AnsiballZ_file.py'
Nov 22 09:56:07 compute-0 sudo[177247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:07 compute-0 python3.9[177249]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:07 compute-0 sudo[177247]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:08 compute-0 sudo[177399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccezcufxnjkmyyqhgpromvgusmxwaagw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805367.8261576-892-87452662823326/AnsiballZ_command.py'
Nov 22 09:56:08 compute-0 sudo[177399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:08 compute-0 python3.9[177401]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:56:08 compute-0 sudo[177399]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:09 compute-0 python3.9[177553]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 09:56:09 compute-0 sudo[177703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vapcxzwiwaxjcfpgrojkmihaexknjhlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805369.4916053-910-233507142633284/AnsiballZ_systemd_service.py'
Nov 22 09:56:09 compute-0 sudo[177703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:10 compute-0 python3.9[177705]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:56:10 compute-0 systemd[1]: Reloading.
Nov 22 09:56:10 compute-0 systemd-sysv-generator[177734]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:56:10 compute-0 systemd-rc-local-generator[177731]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:56:10 compute-0 sudo[177703]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:10 compute-0 sudo[177891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlwykhkkuumecpsdfhmgwuacnbclalyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805370.5591347-918-191139807751715/AnsiballZ_command.py'
Nov 22 09:56:10 compute-0 sudo[177891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:10 compute-0 python3.9[177893]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:56:11 compute-0 sudo[177891]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:11 compute-0 sudo[178044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhezbvourttxipdlmfitxkmrcstuxndd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805371.1362379-918-16287737719405/AnsiballZ_command.py'
Nov 22 09:56:11 compute-0 sudo[178044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:11 compute-0 python3.9[178046]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:56:11 compute-0 sudo[178044]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:12 compute-0 sudo[178197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bicfwspixtqcayvpycmxtxfmmvuukeng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805371.7282448-918-41410837172499/AnsiballZ_command.py'
Nov 22 09:56:12 compute-0 sudo[178197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:12 compute-0 python3.9[178199]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:56:12 compute-0 sudo[178197]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:12 compute-0 sudo[178350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doupxmnttwaiizvskdhaiygyyyerqxxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805372.4221709-918-13427646748740/AnsiballZ_command.py'
Nov 22 09:56:12 compute-0 sudo[178350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:12 compute-0 python3.9[178352]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:56:12 compute-0 sudo[178350]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:13 compute-0 sudo[178503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwlrugzhjsjrsycerkjcfygbjutmjtvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805373.0817535-918-228517823851783/AnsiballZ_command.py'
Nov 22 09:56:13 compute-0 sudo[178503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:13 compute-0 python3.9[178505]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:56:13 compute-0 sudo[178503]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:13 compute-0 sudo[178656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmgwmdmakgmwkhpmbeafplnhkjaejonp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805373.645938-918-124471954652466/AnsiballZ_command.py'
Nov 22 09:56:13 compute-0 sudo[178656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:14 compute-0 python3.9[178658]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:56:14 compute-0 sudo[178656]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:14 compute-0 sudo[178809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkfbyulbymhhegbsdlcngeqceyaefzlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805374.2507071-918-11312082847589/AnsiballZ_command.py'
Nov 22 09:56:14 compute-0 sudo[178809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:14 compute-0 python3.9[178811]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:56:14 compute-0 sudo[178809]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:15 compute-0 sudo[178962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwafafgffvdfhnlqimouqnqmczlypmmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805375.056482-918-201443555724402/AnsiballZ_command.py'
Nov 22 09:56:15 compute-0 sudo[178962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:15 compute-0 python3.9[178964]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:56:15 compute-0 sudo[178962]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:15 compute-0 podman[178966]: 2025-11-22 09:56:15.772277165 +0000 UTC m=+0.109258428 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 09:56:17 compute-0 sudo[179142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guhybqxspszwpnbdiqbslfhfeztxnzvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805376.713369-997-118348555368493/AnsiballZ_file.py'
Nov 22 09:56:17 compute-0 sudo[179142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:17 compute-0 python3.9[179144]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:17 compute-0 sudo[179142]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:17 compute-0 sudo[179294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbloztdetxmfqxrwpnzaevtclweygoii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805377.4993339-997-169228554097343/AnsiballZ_file.py'
Nov 22 09:56:17 compute-0 sudo[179294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:56:17.919 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:56:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:56:17.920 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:56:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:56:17.920 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:56:18 compute-0 python3.9[179296]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:18 compute-0 sudo[179294]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:18 compute-0 sudo[179446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpugzkyjkcbykqdcduhxmndkiiwjgppy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805378.180662-997-11621624420805/AnsiballZ_file.py'
Nov 22 09:56:18 compute-0 sudo[179446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:18 compute-0 python3.9[179448]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:18 compute-0 sudo[179446]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:19 compute-0 sudo[179598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bswocojoaypslrtajyxxuxnihckgecmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805378.9977558-1019-119349989387358/AnsiballZ_file.py'
Nov 22 09:56:19 compute-0 sudo[179598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:19 compute-0 python3.9[179600]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:19 compute-0 sudo[179598]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:20 compute-0 sudo[179750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmalldyekluleqkefqesrwlrismkqwgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805379.7546923-1019-132982876187404/AnsiballZ_file.py'
Nov 22 09:56:20 compute-0 sudo[179750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:20 compute-0 python3.9[179752]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:20 compute-0 sudo[179750]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:20 compute-0 sudo[179902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbpuzyvxgqfbujllhpsehuaxwlykqwkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805380.431373-1019-134003945354123/AnsiballZ_file.py'
Nov 22 09:56:20 compute-0 sudo[179902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:20 compute-0 python3.9[179904]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:20 compute-0 sudo[179902]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:21 compute-0 sudo[180054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pogmcixjpajmppppoqheslbhodathdpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805381.1139295-1019-102283160046715/AnsiballZ_file.py'
Nov 22 09:56:21 compute-0 sudo[180054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:21 compute-0 python3.9[180056]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:21 compute-0 sudo[180054]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:21 compute-0 podman[180057]: 2025-11-22 09:56:21.599048097 +0000 UTC m=+0.057649857 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:56:21 compute-0 sudo[180225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paxriowdwxvqraaxvmckgbruixfwvvcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805381.6895053-1019-75129131610259/AnsiballZ_file.py'
Nov 22 09:56:21 compute-0 sudo[180225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:22 compute-0 python3.9[180227]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:22 compute-0 sudo[180225]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:22 compute-0 sudo[180377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcxofmviscbsbemtlswiajckolfpbkrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805382.3229816-1019-258898392111151/AnsiballZ_file.py'
Nov 22 09:56:22 compute-0 sudo[180377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:22 compute-0 python3.9[180379]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:22 compute-0 sudo[180377]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:23 compute-0 sudo[180529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsipvqhospsmoigwtbbxukdpydgsaskj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805383.049342-1019-181827849930545/AnsiballZ_file.py'
Nov 22 09:56:23 compute-0 sudo[180529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:23 compute-0 python3.9[180531]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:23 compute-0 sudo[180529]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:26 compute-0 podman[180556]: 2025-11-22 09:56:26.607259218 +0000 UTC m=+0.068240692 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 09:56:28 compute-0 sudo[180702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwxouasrwilyhzeluuvbndzmwugkjvny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805388.562192-1188-128145485403663/AnsiballZ_getent.py'
Nov 22 09:56:28 compute-0 sudo[180702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:29 compute-0 python3.9[180704]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 22 09:56:29 compute-0 sudo[180702]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:29 compute-0 sudo[180855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfcdtbciwvnsftrxwndnaaupmgqxfmhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805389.327-1196-17430759597061/AnsiballZ_group.py'
Nov 22 09:56:29 compute-0 sudo[180855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:29 compute-0 python3.9[180857]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 09:56:29 compute-0 groupadd[180858]: group added to /etc/group: name=nova, GID=42436
Nov 22 09:56:29 compute-0 groupadd[180858]: group added to /etc/gshadow: name=nova
Nov 22 09:56:29 compute-0 groupadd[180858]: new group: name=nova, GID=42436
Nov 22 09:56:30 compute-0 sudo[180855]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:30 compute-0 sudo[181013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sumayflyvsejvbygkhppeyqqhwysleez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805390.1829283-1204-194336451236983/AnsiballZ_user.py'
Nov 22 09:56:30 compute-0 sudo[181013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:30 compute-0 python3.9[181015]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 09:56:31 compute-0 useradd[181017]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 22 09:56:31 compute-0 useradd[181017]: add 'nova' to group 'libvirt'
Nov 22 09:56:31 compute-0 useradd[181017]: add 'nova' to shadow group 'libvirt'
Nov 22 09:56:31 compute-0 sudo[181013]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:32 compute-0 sshd-session[181048]: Accepted publickey for zuul from 192.168.122.30 port 42468 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:56:32 compute-0 systemd-logind[819]: New session 24 of user zuul.
Nov 22 09:56:32 compute-0 systemd[1]: Started Session 24 of User zuul.
Nov 22 09:56:32 compute-0 sshd-session[181048]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:56:32 compute-0 sshd-session[181051]: Received disconnect from 192.168.122.30 port 42468:11: disconnected by user
Nov 22 09:56:32 compute-0 sshd-session[181051]: Disconnected from user zuul 192.168.122.30 port 42468
Nov 22 09:56:32 compute-0 sshd-session[181048]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:56:32 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Nov 22 09:56:32 compute-0 systemd-logind[819]: Session 24 logged out. Waiting for processes to exit.
Nov 22 09:56:32 compute-0 systemd-logind[819]: Removed session 24.
Nov 22 09:56:32 compute-0 python3.9[181201]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:56:33 compute-0 python3.9[181322]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805392.5017045-1229-67324136354584/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:34 compute-0 python3.9[181472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:56:34 compute-0 python3.9[181548]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:35 compute-0 python3.9[181698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:56:36 compute-0 python3.9[181819]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805394.8508613-1229-226117012413683/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:36 compute-0 python3.9[181969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:56:37 compute-0 python3.9[182090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805396.4036133-1229-28266089693778/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:37 compute-0 python3.9[182240]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:56:38 compute-0 python3.9[182361]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805397.5749688-1229-120005958751785/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:39 compute-0 python3.9[182511]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:56:39 compute-0 python3.9[182632]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805398.6821015-1229-70938332964587/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:40 compute-0 sudo[182782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srgjbobeppbrglxluigiqqkrkzerbqxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805399.9770293-1312-236488024831318/AnsiballZ_file.py'
Nov 22 09:56:40 compute-0 sudo[182782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:40 compute-0 python3.9[182784]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:40 compute-0 sudo[182782]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:40 compute-0 sudo[182934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkhjvtriakoifydymzxdnicyfjorqohl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805400.6801882-1320-9966992265069/AnsiballZ_copy.py'
Nov 22 09:56:40 compute-0 sudo[182934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:41 compute-0 python3.9[182936]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:41 compute-0 sudo[182934]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:41 compute-0 sudo[183086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruvsnnaogbnqrxklmiriaftubhudqvgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805401.3342562-1328-63352683689078/AnsiballZ_stat.py'
Nov 22 09:56:41 compute-0 sudo[183086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:41 compute-0 python3.9[183088]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:56:41 compute-0 sudo[183086]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:42 compute-0 sudo[183238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoywwgvrzhdxcnmtgtaychhvbmagflyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805402.0302289-1336-208240224928552/AnsiballZ_stat.py'
Nov 22 09:56:42 compute-0 sudo[183238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:42 compute-0 python3.9[183240]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:56:42 compute-0 sudo[183238]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:42 compute-0 sudo[183361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtpummvzkgdaaxtbwpbcqdcypfqxbbwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805402.0302289-1336-208240224928552/AnsiballZ_copy.py'
Nov 22 09:56:42 compute-0 sudo[183361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:43 compute-0 python3.9[183363]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1763805402.0302289-1336-208240224928552/.source _original_basename=.c_zwbs9a follow=False checksum=10a08135fa2dc38c5a714c523e615343261fcfa8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 22 09:56:43 compute-0 sudo[183361]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:43 compute-0 python3.9[183515]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:56:44 compute-0 python3.9[183667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:56:45 compute-0 python3.9[183788]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805404.1063316-1362-171862457350167/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:45 compute-0 python3.9[183938]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:56:46 compute-0 podman[184033]: 2025-11-22 09:56:46.432028032 +0000 UTC m=+0.154293184 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true)
Nov 22 09:56:46 compute-0 python3.9[184073]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805405.3645365-1377-21038069311267/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:56:47 compute-0 sudo[184237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxlhgstdnubsqbpxozxkfwxofednhbfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805406.803717-1394-70899351393117/AnsiballZ_container_config_data.py'
Nov 22 09:56:47 compute-0 sudo[184237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:47 compute-0 python3.9[184239]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 22 09:56:47 compute-0 sudo[184237]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:47 compute-0 sudo[184389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfuafzjyzvvznvpuckvddkqksnvptwgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805407.545579-1403-138570019215602/AnsiballZ_container_config_hash.py'
Nov 22 09:56:47 compute-0 sudo[184389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:48 compute-0 python3.9[184391]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 09:56:48 compute-0 sudo[184389]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:48 compute-0 sudo[184541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goeidjiixsjobgxhblbzxsnayndhhprt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763805408.3656886-1413-103831887189175/AnsiballZ_edpm_container_manage.py'
Nov 22 09:56:48 compute-0 sudo[184541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:48 compute-0 python3[184543]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 09:56:49 compute-0 podman[184578]: 2025-11-22 09:56:49.205687511 +0000 UTC m=+0.035015116 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 09:56:49 compute-0 podman[184578]: 2025-11-22 09:56:49.400501428 +0000 UTC m=+0.229828943 container create f65c519806aa68cbad2c5b5acc9006b59f74c5ef19220a070c88dd43166baaaa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, org.label-schema.license=GPLv2, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 22 09:56:49 compute-0 python3[184543]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 22 09:56:49 compute-0 sudo[184541]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:50 compute-0 sudo[184766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdahgbiiqwvxkdtbqfiapbukglkhpeqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805409.7677839-1421-3646629180304/AnsiballZ_stat.py'
Nov 22 09:56:50 compute-0 sudo[184766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:50 compute-0 python3.9[184768]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:56:50 compute-0 sudo[184766]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:51 compute-0 sudo[184920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbtemhuzstlhplrtyybnpfkplwzuoabs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805410.7426343-1433-130341670004544/AnsiballZ_container_config_data.py'
Nov 22 09:56:51 compute-0 sudo[184920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:51 compute-0 python3.9[184922]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 22 09:56:51 compute-0 sudo[184920]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:51 compute-0 sudo[185081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvnpekiqhusxyqnxmcqdlyrlvioxawim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805411.5450768-1442-239585256926228/AnsiballZ_container_config_hash.py'
Nov 22 09:56:51 compute-0 sudo[185081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:51 compute-0 podman[185046]: 2025-11-22 09:56:51.833825503 +0000 UTC m=+0.055172279 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 09:56:52 compute-0 python3.9[185094]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 09:56:52 compute-0 sudo[185081]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:52 compute-0 sudo[185244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwguvnzynzluuiwhczduefqbwkpfvbcu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763805412.3566425-1452-153055519063257/AnsiballZ_edpm_container_manage.py'
Nov 22 09:56:52 compute-0 sudo[185244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:52 compute-0 python3[185246]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 09:56:53 compute-0 podman[185286]: 2025-11-22 09:56:53.264411774 +0000 UTC m=+0.070198495 container create 711bb2256a425103fa554a92aa26e398a596da483c65e94bac2e94504aba5123 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 09:56:53 compute-0 podman[185286]: 2025-11-22 09:56:53.221898087 +0000 UTC m=+0.027684808 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 09:56:53 compute-0 python3[185246]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 22 09:56:53 compute-0 sudo[185244]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:53 compute-0 sudo[185474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dndxiyjszrugymcucreddetxqwlmrqie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805413.5967605-1460-237883447610077/AnsiballZ_stat.py'
Nov 22 09:56:53 compute-0 sudo[185474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:54 compute-0 python3.9[185476]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:56:54 compute-0 sudo[185474]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:54 compute-0 sudo[185628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcohfwbitahwaxrfqsjcxodeklmemzax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805414.4373305-1469-244329258224837/AnsiballZ_file.py'
Nov 22 09:56:54 compute-0 sudo[185628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:55 compute-0 python3.9[185630]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:55 compute-0 sudo[185628]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:55 compute-0 sudo[185779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnzjlhvkbqqzzvqxmumindxtgibwrwgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805415.0793421-1469-64258922929567/AnsiballZ_copy.py'
Nov 22 09:56:55 compute-0 sudo[185779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:55 compute-0 python3.9[185781]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763805415.0793421-1469-64258922929567/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:56:55 compute-0 sudo[185779]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:55 compute-0 sudo[185855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npxaxssocqlbeoewxilehgaqmnkmsjqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805415.0793421-1469-64258922929567/AnsiballZ_systemd.py'
Nov 22 09:56:55 compute-0 sudo[185855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:56 compute-0 python3.9[185857]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:56:56 compute-0 systemd[1]: Reloading.
Nov 22 09:56:56 compute-0 systemd-rc-local-generator[185886]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:56:56 compute-0 systemd-sysv-generator[185890]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:56:56 compute-0 sudo[185855]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:56 compute-0 podman[185893]: 2025-11-22 09:56:56.807953626 +0000 UTC m=+0.079832305 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 09:56:56 compute-0 sudo[185986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvdyismzttnpdjltngekwyusnkwsekgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805415.0793421-1469-64258922929567/AnsiballZ_systemd.py'
Nov 22 09:56:56 compute-0 sudo[185986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:56:57 compute-0 python3.9[185988]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:56:57 compute-0 systemd[1]: Reloading.
Nov 22 09:56:57 compute-0 systemd-rc-local-generator[186018]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:56:57 compute-0 systemd-sysv-generator[186021]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:56:57 compute-0 systemd[1]: Starting nova_compute container...
Nov 22 09:56:57 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:56:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e7f1653dec9c07c591cec335ecf4ad6b9e03a53934e37187e960c4d87412e43/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 22 09:56:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e7f1653dec9c07c591cec335ecf4ad6b9e03a53934e37187e960c4d87412e43/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 09:56:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e7f1653dec9c07c591cec335ecf4ad6b9e03a53934e37187e960c4d87412e43/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 09:56:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e7f1653dec9c07c591cec335ecf4ad6b9e03a53934e37187e960c4d87412e43/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 22 09:56:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e7f1653dec9c07c591cec335ecf4ad6b9e03a53934e37187e960c4d87412e43/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 09:56:58 compute-0 podman[186028]: 2025-11-22 09:56:58.034260225 +0000 UTC m=+0.320535881 container init 711bb2256a425103fa554a92aa26e398a596da483c65e94bac2e94504aba5123 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:56:58 compute-0 podman[186028]: 2025-11-22 09:56:58.040515763 +0000 UTC m=+0.326791389 container start 711bb2256a425103fa554a92aa26e398a596da483c65e94bac2e94504aba5123 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, managed_by=edpm_ansible)
Nov 22 09:56:58 compute-0 nova_compute[186044]: + sudo -E kolla_set_configs
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Validating config file
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Copying service configuration files
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Deleting /etc/ceph
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Creating directory /etc/ceph
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /etc/ceph
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Writing out command to execute
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 09:56:58 compute-0 nova_compute[186044]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 09:56:58 compute-0 nova_compute[186044]: ++ cat /run_command
Nov 22 09:56:58 compute-0 nova_compute[186044]: + CMD=nova-compute
Nov 22 09:56:58 compute-0 nova_compute[186044]: + ARGS=
Nov 22 09:56:58 compute-0 nova_compute[186044]: + sudo kolla_copy_cacerts
Nov 22 09:56:58 compute-0 nova_compute[186044]: + [[ ! -n '' ]]
Nov 22 09:56:58 compute-0 nova_compute[186044]: + . kolla_extend_start
Nov 22 09:56:58 compute-0 nova_compute[186044]: Running command: 'nova-compute'
Nov 22 09:56:58 compute-0 nova_compute[186044]: + echo 'Running command: '\''nova-compute'\'''
Nov 22 09:56:58 compute-0 nova_compute[186044]: + umask 0022
Nov 22 09:56:58 compute-0 nova_compute[186044]: + exec nova-compute
Nov 22 09:56:58 compute-0 podman[186028]: nova_compute
Nov 22 09:56:58 compute-0 systemd[1]: Started nova_compute container.
Nov 22 09:56:58 compute-0 sudo[185986]: pam_unix(sudo:session): session closed for user root
Nov 22 09:56:59 compute-0 python3.9[186206]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:57:00 compute-0 nova_compute[186044]: 2025-11-22 09:57:00.102 186048 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 09:57:00 compute-0 nova_compute[186044]: 2025-11-22 09:57:00.102 186048 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 09:57:00 compute-0 nova_compute[186044]: 2025-11-22 09:57:00.103 186048 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 09:57:00 compute-0 nova_compute[186044]: 2025-11-22 09:57:00.103 186048 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 22 09:57:00 compute-0 python3.9[186356]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:57:00 compute-0 nova_compute[186044]: 2025-11-22 09:57:00.235 186048 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 09:57:00 compute-0 nova_compute[186044]: 2025-11-22 09:57:00.264 186048 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 09:57:00 compute-0 nova_compute[186044]: 2025-11-22 09:57:00.264 186048 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 22 09:57:00 compute-0 nova_compute[186044]: 2025-11-22 09:57:00.882 186048 INFO nova.virt.driver [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.003 186048 INFO nova.compute.provider_config [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.018 186048 DEBUG oslo_concurrency.lockutils [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.018 186048 DEBUG oslo_concurrency.lockutils [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.018 186048 DEBUG oslo_concurrency.lockutils [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.019 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.019 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.019 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.019 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.019 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.019 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.020 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.020 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.020 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.020 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.020 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.020 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.020 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.021 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.021 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.021 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.021 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.021 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.021 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.021 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.021 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.022 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.022 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.022 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.022 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.022 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.022 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.023 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.023 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.023 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.023 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.023 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.023 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.023 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.023 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.024 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.024 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.024 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.024 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.024 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.024 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.024 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.025 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.025 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.025 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.025 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.025 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.025 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.025 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.026 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.026 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.026 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.026 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.026 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.026 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.026 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.027 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.027 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.027 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.027 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.027 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.027 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.027 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.028 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.028 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.028 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.028 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.028 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.028 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.028 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.028 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.029 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.029 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.029 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.029 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.029 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.029 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.029 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.030 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.030 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.030 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.030 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.030 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.030 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.030 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.030 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.031 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.031 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.031 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.031 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.031 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.031 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.031 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.032 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.032 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.032 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.032 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.032 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.032 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.032 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.033 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.033 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.033 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.033 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.033 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.033 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.033 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.034 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.034 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.034 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.034 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.034 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.034 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.034 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.035 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.035 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.035 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.035 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.035 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.035 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.035 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.036 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.036 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.036 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.036 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.036 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.036 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.036 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.036 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.037 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.037 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.037 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.037 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.037 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.037 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.037 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.038 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.038 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.038 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.038 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.038 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.038 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.038 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.039 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.039 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.039 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.039 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.039 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.039 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.039 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.040 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.040 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.040 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.040 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.040 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.040 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.040 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.041 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.041 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.041 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.041 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.041 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.041 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.041 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.042 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.042 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.042 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.042 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.042 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.042 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 python3.9[186510]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.042 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.043 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.043 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.043 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.043 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.043 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.043 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.043 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.044 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.044 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.044 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.044 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.044 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.045 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.045 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.045 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.045 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.045 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.045 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.045 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.046 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.046 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.046 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.046 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.046 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.046 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.046 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.046 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.047 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.047 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.047 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.047 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.047 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.047 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.048 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.048 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.048 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.048 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.048 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.048 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.048 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.049 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.049 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.049 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.049 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.049 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.049 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.050 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.050 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.050 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.050 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.050 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.050 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.050 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.051 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.051 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.051 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.051 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.051 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.051 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.051 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.052 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.052 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.052 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.052 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.052 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.052 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.052 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.053 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.053 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.053 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.053 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.053 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.053 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.053 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.054 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.054 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.054 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.054 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.054 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.054 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.054 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.054 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.055 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.055 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.055 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.055 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.055 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.055 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.055 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.056 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.056 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.056 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.056 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.056 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.056 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.056 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.057 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.057 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.057 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.057 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.057 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.057 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.058 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.058 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.058 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.058 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.058 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.058 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.058 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.059 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.059 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.059 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.059 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.059 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.059 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.059 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.060 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.060 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.060 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.060 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.060 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.060 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.060 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.061 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.061 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.061 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.061 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.061 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.061 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.062 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.062 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.062 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.062 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.062 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.062 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.062 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.063 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.063 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.063 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.063 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.063 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.063 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.063 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.064 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.064 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.064 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.064 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.064 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.064 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.064 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.065 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.065 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.065 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.065 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.065 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.065 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.065 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.066 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.066 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.066 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.066 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.066 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.066 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.066 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.067 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.067 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.067 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.067 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.067 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.067 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.067 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.068 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.068 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.068 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.068 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.068 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.068 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.069 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.069 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.069 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.069 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.069 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.069 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.069 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.070 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.070 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.070 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.070 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.070 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.071 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.071 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.071 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.071 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.071 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.071 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.072 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.072 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.072 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.072 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.072 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.072 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.072 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.073 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.073 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.073 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.073 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.073 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.073 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.073 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.074 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.074 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.074 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.074 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.074 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.074 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.074 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.075 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.075 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.075 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.075 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.075 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.075 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.076 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.076 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.076 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.076 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.076 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.076 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.077 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.077 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.077 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.077 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.077 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.077 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.077 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.078 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.078 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.078 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.078 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.078 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.078 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.079 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.079 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.079 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.079 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.079 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.079 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.080 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.080 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.080 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.080 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.080 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.080 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.080 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.081 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.081 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.081 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.081 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.081 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.081 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.081 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.082 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.082 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.082 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.082 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.082 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.082 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.082 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.083 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.083 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.083 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.083 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.083 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.083 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.083 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.084 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.084 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.084 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.084 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.084 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.084 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.085 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.085 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.085 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.085 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.085 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.085 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.086 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.086 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.086 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.086 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.086 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.086 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.086 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.087 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.087 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.087 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.087 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.087 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.087 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.087 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.088 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.088 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.088 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.088 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.088 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.089 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.089 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.089 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.089 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.089 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.089 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.089 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.090 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.090 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.090 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.090 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.090 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.090 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.091 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.091 186048 WARNING oslo_config.cfg [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 22 09:57:01 compute-0 nova_compute[186044]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 22 09:57:01 compute-0 nova_compute[186044]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 22 09:57:01 compute-0 nova_compute[186044]: and ``live_migration_inbound_addr`` respectively.
Nov 22 09:57:01 compute-0 nova_compute[186044]: ).  Its value may be silently ignored in the future.
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.091 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.091 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.091 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.091 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.092 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.092 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.092 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.092 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.092 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.092 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.092 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.093 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.093 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.093 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.093 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.093 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.093 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.093 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.094 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.094 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.094 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.094 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.094 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.094 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.094 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.094 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.095 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.095 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.095 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.095 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.095 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.095 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.096 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.096 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.096 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.096 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.096 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.096 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.096 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.097 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.097 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.097 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.097 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.097 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.097 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.097 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.098 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.098 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.098 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.098 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.098 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.098 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.098 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.099 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.099 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.099 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.099 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.099 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.099 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.100 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.100 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.100 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.100 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.100 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.100 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.100 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.100 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.101 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.101 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.101 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.101 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.101 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.101 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.101 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.102 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.102 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.102 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.102 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.102 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.102 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.102 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.103 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.103 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.103 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.103 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.103 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.103 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.104 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.104 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.104 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.104 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.104 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.104 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.104 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.105 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.105 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.105 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.105 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.105 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.105 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.105 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.106 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.106 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.106 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.106 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.106 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.106 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.106 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.106 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.107 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.107 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.107 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.107 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.107 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.107 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.107 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.107 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.108 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.108 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.108 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.108 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.108 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.108 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.108 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.109 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.109 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.109 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.109 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.109 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.109 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.109 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.110 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.110 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.110 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.110 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.110 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.110 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.110 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.111 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.111 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.111 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.111 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.111 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.111 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.111 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.112 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.112 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.112 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.112 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.112 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.112 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.112 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.113 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.113 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.113 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.113 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.113 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.113 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.113 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.114 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.114 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.114 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.114 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.114 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.114 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.114 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.115 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.115 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.115 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.115 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.115 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.115 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.115 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.116 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.116 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.116 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.116 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.116 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.116 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.116 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.117 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.117 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.117 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.117 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.117 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.117 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.117 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.117 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.118 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.118 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.118 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.118 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.118 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.118 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.118 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.119 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.119 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.119 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.119 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.119 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.119 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.119 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.120 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.120 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.120 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.120 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.120 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.120 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.120 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.121 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.121 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.121 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.121 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.121 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.121 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.121 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.121 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.122 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.122 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.122 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.122 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.122 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.122 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.122 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.123 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.123 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.123 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.123 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.123 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.123 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.123 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.124 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.124 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.124 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.124 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.124 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.124 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.124 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.124 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.125 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.125 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.125 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.125 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.125 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.125 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.125 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.126 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.126 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.126 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.126 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.126 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.126 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.127 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.127 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.127 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.127 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.127 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.127 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.127 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.128 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.128 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.128 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.128 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.128 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.128 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.128 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.129 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.129 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.129 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.129 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.129 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.129 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.129 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.129 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.130 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.130 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.130 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.130 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.130 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.130 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.130 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.131 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.131 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.131 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.131 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.131 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.131 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.131 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.132 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.132 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.132 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.132 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.132 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.132 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.132 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.133 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.133 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.133 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.133 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.133 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.133 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.133 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.134 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.134 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.134 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.134 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.134 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.134 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.134 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.134 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.135 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.135 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.135 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.135 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.135 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.135 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.135 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.136 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.136 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.136 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.136 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.136 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.136 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.136 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.137 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.137 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.137 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.137 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.137 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.137 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.137 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.138 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.138 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.138 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.138 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.138 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.138 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.138 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.139 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.139 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.139 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.139 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.139 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.139 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.139 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.140 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.140 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.140 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.140 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.140 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.140 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.140 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.141 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.141 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.141 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.141 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.141 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.141 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.141 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.141 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.142 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.142 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.142 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.142 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.142 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.142 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.142 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.143 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.143 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.143 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.143 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.143 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.143 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.143 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.143 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.144 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.144 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.144 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.144 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.144 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.144 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.144 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.145 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.145 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.145 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.145 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.145 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.145 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.145 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.145 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.146 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.146 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.146 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.146 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.146 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.146 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.146 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.147 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.147 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.147 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.147 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.147 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.147 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.147 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.147 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.148 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.148 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.148 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.148 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.148 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.148 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.148 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.149 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.149 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.149 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.149 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.149 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.149 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.149 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.149 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.150 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.150 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.150 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.150 186048 DEBUG oslo_service.service [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.151 186048 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.166 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.167 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.167 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.167 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 22 09:57:01 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 22 09:57:01 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.233 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f04adf94df0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.236 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f04adf94df0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.238 186048 INFO nova.virt.libvirt.driver [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Connection event '1' reason 'None'
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.263 186048 WARNING nova.virt.libvirt.driver [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 22 09:57:01 compute-0 nova_compute[186044]: 2025-11-22 09:57:01.264 186048 DEBUG nova.virt.libvirt.volume.mount [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 22 09:57:01 compute-0 sudo[186720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtuggxzvycokufyzyfqhghfgjyygpxxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805421.344256-1529-161976842246587/AnsiballZ_podman_container.py'
Nov 22 09:57:01 compute-0 sudo[186720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.074 186048 INFO nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Libvirt host capabilities <capabilities>
Nov 22 09:57:02 compute-0 nova_compute[186044]: 
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <host>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <uuid>451f2cde-49d1-45fa-bcb2-7147a4a4b091</uuid>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <cpu>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <arch>x86_64</arch>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model>EPYC-Rome-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <vendor>AMD</vendor>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <microcode version='16777317'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <signature family='23' model='49' stepping='0'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='x2apic'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='tsc-deadline'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='osxsave'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='hypervisor'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='tsc_adjust'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='spec-ctrl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='stibp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='arch-capabilities'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='cmp_legacy'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='topoext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='virt-ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='lbrv'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='tsc-scale'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='vmcb-clean'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='pause-filter'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='pfthreshold'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='svme-addr-chk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='rdctl-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='skip-l1dfl-vmentry'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='mds-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature name='pschange-mc-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <pages unit='KiB' size='4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <pages unit='KiB' size='2048'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <pages unit='KiB' size='1048576'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </cpu>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <power_management>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <suspend_mem/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <suspend_disk/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <suspend_hybrid/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </power_management>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <iommu support='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <migration_features>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <live/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <uri_transports>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <uri_transport>tcp</uri_transport>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <uri_transport>rdma</uri_transport>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </uri_transports>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </migration_features>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <topology>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <cells num='1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <cell id='0'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:           <memory unit='KiB'>7864320</memory>
Nov 22 09:57:02 compute-0 nova_compute[186044]:           <pages unit='KiB' size='4'>1966080</pages>
Nov 22 09:57:02 compute-0 nova_compute[186044]:           <pages unit='KiB' size='2048'>0</pages>
Nov 22 09:57:02 compute-0 nova_compute[186044]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 22 09:57:02 compute-0 nova_compute[186044]:           <distances>
Nov 22 09:57:02 compute-0 nova_compute[186044]:             <sibling id='0' value='10'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:           </distances>
Nov 22 09:57:02 compute-0 nova_compute[186044]:           <cpus num='8'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:           </cpus>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         </cell>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </cells>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </topology>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <cache>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </cache>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <secmodel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model>selinux</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <doi>0</doi>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </secmodel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <secmodel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model>dac</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <doi>0</doi>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </secmodel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </host>
Nov 22 09:57:02 compute-0 nova_compute[186044]: 
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <guest>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <os_type>hvm</os_type>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <arch name='i686'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <wordsize>32</wordsize>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <domain type='qemu'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <domain type='kvm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </arch>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <features>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <pae/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <nonpae/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <acpi default='on' toggle='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <apic default='on' toggle='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <cpuselection/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <deviceboot/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <disksnapshot default='on' toggle='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <externalSnapshot/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </features>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </guest>
Nov 22 09:57:02 compute-0 nova_compute[186044]: 
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <guest>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <os_type>hvm</os_type>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <arch name='x86_64'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <wordsize>64</wordsize>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <domain type='qemu'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <domain type='kvm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </arch>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <features>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <acpi default='on' toggle='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <apic default='on' toggle='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <cpuselection/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <deviceboot/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <disksnapshot default='on' toggle='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <externalSnapshot/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </features>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </guest>
Nov 22 09:57:02 compute-0 nova_compute[186044]: 
Nov 22 09:57:02 compute-0 nova_compute[186044]: </capabilities>
Nov 22 09:57:02 compute-0 nova_compute[186044]: 
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.082 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.103 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 22 09:57:02 compute-0 nova_compute[186044]: <domainCapabilities>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <domain>kvm</domain>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <arch>i686</arch>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <vcpu max='240'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <iothreads supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <os supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <enum name='firmware'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <loader supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>rom</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pflash</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='readonly'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>yes</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>no</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='secure'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>no</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </loader>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </os>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <cpu>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='host-passthrough' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='hostPassthroughMigratable'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>on</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>off</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='maximum' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='maximumMigratable'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>on</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>off</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='host-model' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <vendor>AMD</vendor>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='x2apic'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='hypervisor'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='stibp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='overflow-recov'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='succor'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='lbrv'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc-scale'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='flushbyasid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='pause-filter'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='pfthreshold'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='disable' name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='custom' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Dhyana-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Genoa'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='auto-ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='auto-ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-128'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-256'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-512'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v6'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v7'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='KnightsMill'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512er'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512pf'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='KnightsMill-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512er'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512pf'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G4-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tbm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G5-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tbm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SierraForest'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cmpccxadd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SierraForest-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cmpccxadd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 python3.9[186722]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='athlon'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='athlon-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='core2duo'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='core2duo-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='coreduo'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='coreduo-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='n270'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='n270-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='phenom'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='phenom-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </cpu>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <memoryBacking supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <enum name='sourceType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>file</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>anonymous</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>memfd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </memoryBacking>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <devices>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <disk supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='diskDevice'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>disk</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>cdrom</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>floppy</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>lun</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='bus'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>ide</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>fdc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>scsi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>sata</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-non-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </disk>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <graphics supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vnc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>egl-headless</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dbus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </graphics>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <video supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='modelType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vga</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>cirrus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>none</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>bochs</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>ramfb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </video>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <hostdev supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='mode'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>subsystem</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='startupPolicy'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>default</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>mandatory</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>requisite</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>optional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='subsysType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pci</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>scsi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='capsType'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='pciBackend'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </hostdev>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <rng supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-non-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>random</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>egd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>builtin</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </rng>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <filesystem supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='driverType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>path</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>handle</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtiofs</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </filesystem>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <tpm supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tpm-tis</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tpm-crb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>emulator</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>external</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendVersion'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>2.0</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </tpm>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <redirdev supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='bus'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </redirdev>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <channel supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pty</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>unix</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </channel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <crypto supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>qemu</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>builtin</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </crypto>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <interface supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>default</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>passt</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </interface>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <panic supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>isa</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>hyperv</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </panic>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <console supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>null</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pty</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dev</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>file</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pipe</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>stdio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>udp</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tcp</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>unix</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>qemu-vdagent</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dbus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </console>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </devices>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <features>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <gic supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <vmcoreinfo supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <genid supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <backingStoreInput supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <backup supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <async-teardown supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <ps2 supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <sev supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <sgx supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <hyperv supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='features'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>relaxed</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vapic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>spinlocks</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vpindex</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>runtime</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>synic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>stimer</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>reset</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vendor_id</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>frequencies</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>reenlightenment</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tlbflush</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>ipi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>avic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>emsr_bitmap</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>xmm_input</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <defaults>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <spinlocks>4095</spinlocks>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <stimer_direct>on</stimer_direct>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </defaults>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </hyperv>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <launchSecurity supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='sectype'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tdx</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </launchSecurity>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </features>
Nov 22 09:57:02 compute-0 nova_compute[186044]: </domainCapabilities>
Nov 22 09:57:02 compute-0 nova_compute[186044]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.110 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 22 09:57:02 compute-0 nova_compute[186044]: <domainCapabilities>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <domain>kvm</domain>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <arch>i686</arch>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <vcpu max='4096'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <iothreads supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <os supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <enum name='firmware'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <loader supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>rom</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pflash</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='readonly'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>yes</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>no</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='secure'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>no</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </loader>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </os>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <cpu>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='host-passthrough' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='hostPassthroughMigratable'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>on</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>off</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='maximum' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='maximumMigratable'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>on</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>off</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='host-model' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <vendor>AMD</vendor>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='x2apic'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='hypervisor'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='stibp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='overflow-recov'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='succor'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='lbrv'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc-scale'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='flushbyasid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='pause-filter'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='pfthreshold'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='disable' name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='custom' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Dhyana-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Genoa'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='auto-ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='auto-ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-128'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-256'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-512'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v6'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v7'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 sudo[186720]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='KnightsMill'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512er'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512pf'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='KnightsMill-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512er'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512pf'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G4-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tbm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G5-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tbm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SierraForest'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cmpccxadd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SierraForest-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cmpccxadd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='athlon'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='athlon-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='core2duo'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='core2duo-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='coreduo'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='coreduo-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='n270'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='n270-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='phenom'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='phenom-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </cpu>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <memoryBacking supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <enum name='sourceType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>file</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>anonymous</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>memfd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </memoryBacking>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <devices>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <disk supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='diskDevice'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>disk</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>cdrom</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>floppy</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>lun</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='bus'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>fdc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>scsi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>sata</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-non-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </disk>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <graphics supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vnc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>egl-headless</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dbus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </graphics>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <video supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='modelType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vga</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>cirrus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>none</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>bochs</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>ramfb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </video>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <hostdev supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='mode'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>subsystem</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='startupPolicy'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>default</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>mandatory</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>requisite</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>optional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='subsysType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pci</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>scsi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='capsType'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='pciBackend'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </hostdev>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <rng supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-non-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>random</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>egd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>builtin</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </rng>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <filesystem supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='driverType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>path</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>handle</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtiofs</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </filesystem>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <tpm supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tpm-tis</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tpm-crb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>emulator</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>external</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendVersion'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>2.0</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </tpm>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <redirdev supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='bus'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </redirdev>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <channel supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pty</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>unix</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </channel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <crypto supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>qemu</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>builtin</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </crypto>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <interface supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>default</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>passt</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </interface>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <panic supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>isa</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>hyperv</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </panic>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <console supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>null</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pty</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dev</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>file</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pipe</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>stdio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>udp</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tcp</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>unix</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>qemu-vdagent</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dbus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </console>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </devices>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <features>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <gic supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <vmcoreinfo supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <genid supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <backingStoreInput supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <backup supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <async-teardown supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <ps2 supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <sev supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <sgx supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <hyperv supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='features'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>relaxed</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vapic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>spinlocks</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vpindex</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>runtime</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>synic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>stimer</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>reset</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vendor_id</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>frequencies</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>reenlightenment</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tlbflush</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>ipi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>avic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>emsr_bitmap</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>xmm_input</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <defaults>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <spinlocks>4095</spinlocks>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <stimer_direct>on</stimer_direct>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </defaults>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </hyperv>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <launchSecurity supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='sectype'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tdx</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </launchSecurity>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </features>
Nov 22 09:57:02 compute-0 nova_compute[186044]: </domainCapabilities>
Nov 22 09:57:02 compute-0 nova_compute[186044]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.177 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.183 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 22 09:57:02 compute-0 nova_compute[186044]: <domainCapabilities>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <domain>kvm</domain>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <arch>x86_64</arch>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <vcpu max='240'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <iothreads supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <os supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <enum name='firmware'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <loader supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>rom</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pflash</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='readonly'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>yes</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>no</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='secure'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>no</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </loader>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </os>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <cpu>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='host-passthrough' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='hostPassthroughMigratable'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>on</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>off</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='maximum' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='maximumMigratable'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>on</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>off</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='host-model' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <vendor>AMD</vendor>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='x2apic'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='hypervisor'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='stibp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='overflow-recov'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='succor'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='lbrv'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc-scale'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='flushbyasid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='pause-filter'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='pfthreshold'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='disable' name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='custom' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Dhyana-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Genoa'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='auto-ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='auto-ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-128'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-256'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-512'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v6'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v7'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='KnightsMill'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512er'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512pf'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='KnightsMill-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512er'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512pf'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G4-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tbm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G5-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tbm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SierraForest'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cmpccxadd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SierraForest-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cmpccxadd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='athlon'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='athlon-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='core2duo'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='core2duo-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='coreduo'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='coreduo-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='n270'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='n270-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='phenom'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='phenom-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </cpu>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <memoryBacking supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <enum name='sourceType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>file</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>anonymous</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>memfd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </memoryBacking>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <devices>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <disk supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='diskDevice'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>disk</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>cdrom</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>floppy</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>lun</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='bus'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>ide</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>fdc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>scsi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>sata</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-non-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </disk>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <graphics supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vnc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>egl-headless</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dbus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </graphics>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <video supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='modelType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vga</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>cirrus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>none</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>bochs</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>ramfb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </video>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <hostdev supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='mode'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>subsystem</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='startupPolicy'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>default</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>mandatory</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>requisite</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>optional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='subsysType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pci</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>scsi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='capsType'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='pciBackend'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </hostdev>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <rng supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-non-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>random</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>egd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>builtin</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </rng>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <filesystem supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='driverType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>path</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>handle</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtiofs</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </filesystem>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <tpm supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tpm-tis</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tpm-crb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>emulator</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>external</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendVersion'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>2.0</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </tpm>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <redirdev supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='bus'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </redirdev>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <channel supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pty</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>unix</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </channel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <crypto supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>qemu</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>builtin</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </crypto>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <interface supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>default</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>passt</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </interface>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <panic supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>isa</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>hyperv</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </panic>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <console supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>null</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pty</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dev</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>file</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pipe</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>stdio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>udp</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tcp</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>unix</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>qemu-vdagent</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dbus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </console>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </devices>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <features>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <gic supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <vmcoreinfo supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <genid supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <backingStoreInput supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <backup supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <async-teardown supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <ps2 supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <sev supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <sgx supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <hyperv supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='features'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>relaxed</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vapic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>spinlocks</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vpindex</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>runtime</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>synic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>stimer</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>reset</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vendor_id</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>frequencies</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>reenlightenment</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tlbflush</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>ipi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>avic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>emsr_bitmap</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>xmm_input</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <defaults>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <spinlocks>4095</spinlocks>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <stimer_direct>on</stimer_direct>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </defaults>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </hyperv>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <launchSecurity supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='sectype'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tdx</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </launchSecurity>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </features>
Nov 22 09:57:02 compute-0 nova_compute[186044]: </domainCapabilities>
Nov 22 09:57:02 compute-0 nova_compute[186044]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.269 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 22 09:57:02 compute-0 nova_compute[186044]: <domainCapabilities>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <domain>kvm</domain>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <arch>x86_64</arch>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <vcpu max='4096'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <iothreads supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <os supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <enum name='firmware'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>efi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <loader supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>rom</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pflash</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='readonly'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>yes</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>no</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='secure'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>yes</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>no</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </loader>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </os>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <cpu>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='host-passthrough' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='hostPassthroughMigratable'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>on</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>off</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='maximum' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='maximumMigratable'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>on</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>off</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='host-model' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <vendor>AMD</vendor>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='x2apic'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='hypervisor'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='stibp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='overflow-recov'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='succor'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='lbrv'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='tsc-scale'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='flushbyasid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='pause-filter'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='pfthreshold'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <feature policy='disable' name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <mode name='custom' supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Broadwell-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Cooperlake-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Denverton-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Dhyana-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Genoa'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='auto-ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='auto-ibrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Milan-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amd-psfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='stibp-always-on'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-Rome-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='EPYC-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='GraniteRapids-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-128'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-256'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx10-512'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='prefetchiti'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Haswell-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v6'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Icelake-Server-v7'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='IvyBridge-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='KnightsMill'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512er'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512pf'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='KnightsMill-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512er'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512pf'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G4-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tbm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Opteron_G5-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fma4'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tbm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xop'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SapphireRapids-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='amx-tile'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-bf16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-fp16'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bitalg'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrc'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fzrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='la57'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='taa-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xfd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SierraForest'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cmpccxadd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='SierraForest-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ifma'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cmpccxadd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fbsdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='fsrs'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ibrs-all'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mcdt-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pbrsb-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='psdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='serialize'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vaes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Client-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='hle'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='rtm'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Skylake-Server-v5'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512bw'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512cd'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512dq'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512f'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='avx512vl'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='invpcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pcid'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='pku'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='mpx'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v2'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v3'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='core-capability'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='split-lock-detect'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='Snowridge-v4'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='cldemote'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='erms'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='gfni'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdir64b'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='movdiri'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='xsaves'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='athlon'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='athlon-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='core2duo'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='core2duo-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='coreduo'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='coreduo-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='n270'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='n270-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='ss'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='phenom'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <blockers model='phenom-v1'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnow'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <feature name='3dnowext'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </blockers>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </mode>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </cpu>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <memoryBacking supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <enum name='sourceType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>file</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>anonymous</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <value>memfd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </memoryBacking>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <devices>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <disk supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='diskDevice'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>disk</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>cdrom</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>floppy</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>lun</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='bus'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>fdc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>scsi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>sata</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-non-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </disk>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <graphics supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vnc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>egl-headless</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dbus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </graphics>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <video supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='modelType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vga</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>cirrus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>none</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>bochs</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>ramfb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </video>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <hostdev supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='mode'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>subsystem</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='startupPolicy'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>default</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>mandatory</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>requisite</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>optional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='subsysType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pci</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>scsi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='capsType'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='pciBackend'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </hostdev>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <rng supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtio-non-transitional</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>random</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>egd</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>builtin</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </rng>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <filesystem supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='driverType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>path</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>handle</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>virtiofs</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </filesystem>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <tpm supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tpm-tis</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tpm-crb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>emulator</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>external</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendVersion'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>2.0</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </tpm>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <redirdev supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='bus'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>usb</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </redirdev>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <channel supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pty</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>unix</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </channel>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <crypto supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>qemu</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendModel'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>builtin</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </crypto>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <interface supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='backendType'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>default</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>passt</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </interface>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <panic supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='model'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>isa</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>hyperv</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </panic>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <console supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='type'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>null</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vc</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pty</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dev</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>file</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>pipe</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>stdio</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>udp</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tcp</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>unix</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>qemu-vdagent</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>dbus</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </console>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </devices>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   <features>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <gic supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <vmcoreinfo supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <genid supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <backingStoreInput supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <backup supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <async-teardown supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <ps2 supported='yes'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <sev supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <sgx supported='no'/>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <hyperv supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='features'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>relaxed</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vapic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>spinlocks</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vpindex</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>runtime</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>synic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>stimer</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>reset</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>vendor_id</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>frequencies</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>reenlightenment</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tlbflush</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>ipi</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>avic</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>emsr_bitmap</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>xmm_input</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <defaults>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <spinlocks>4095</spinlocks>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <stimer_direct>on</stimer_direct>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </defaults>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </hyperv>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     <launchSecurity supported='yes'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       <enum name='sectype'>
Nov 22 09:57:02 compute-0 nova_compute[186044]:         <value>tdx</value>
Nov 22 09:57:02 compute-0 nova_compute[186044]:       </enum>
Nov 22 09:57:02 compute-0 nova_compute[186044]:     </launchSecurity>
Nov 22 09:57:02 compute-0 nova_compute[186044]:   </features>
Nov 22 09:57:02 compute-0 nova_compute[186044]: </domainCapabilities>
Nov 22 09:57:02 compute-0 nova_compute[186044]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.349 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.349 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.349 186048 DEBUG nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.349 186048 INFO nova.virt.libvirt.host [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Secure Boot support detected
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.352 186048 INFO nova.virt.libvirt.driver [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.352 186048 INFO nova.virt.libvirt.driver [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.364 186048 DEBUG nova.virt.libvirt.driver [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.513 186048 INFO nova.virt.node [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Determined node identity dd02da68-d6c7-4f1a-8710-21abb7ad1703 from /var/lib/nova/compute_id
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.536 186048 WARNING nova.compute.manager [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Compute nodes ['dd02da68-d6c7-4f1a-8710-21abb7ad1703'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.573 186048 INFO nova.compute.manager [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.618 186048 WARNING nova.compute.manager [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.618 186048 DEBUG oslo_concurrency.lockutils [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.619 186048 DEBUG oslo_concurrency.lockutils [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.619 186048 DEBUG oslo_concurrency.lockutils [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.619 186048 DEBUG nova.compute.resource_tracker [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:57:02 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 22 09:57:02 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 22 09:57:02 compute-0 sudo[186916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lntzvjvfjbysewqoeqmfhijfbdwvipag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805422.4746218-1537-126309252267374/AnsiballZ_systemd.py'
Nov 22 09:57:02 compute-0 sudo[186916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.883 186048 WARNING nova.virt.libvirt.driver [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.884 186048 DEBUG nova.compute.resource_tracker [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6178MB free_disk=73.66306686401367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.885 186048 DEBUG oslo_concurrency.lockutils [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.885 186048 DEBUG oslo_concurrency.lockutils [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.903 186048 WARNING nova.compute.resource_tracker [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] No compute node record for compute-0.ctlplane.example.com:dd02da68-d6c7-4f1a-8710-21abb7ad1703: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host dd02da68-d6c7-4f1a-8710-21abb7ad1703 could not be found.
Nov 22 09:57:02 compute-0 nova_compute[186044]: 2025-11-22 09:57:02.930 186048 INFO nova.compute.resource_tracker [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: dd02da68-d6c7-4f1a-8710-21abb7ad1703
Nov 22 09:57:03 compute-0 nova_compute[186044]: 2025-11-22 09:57:03.008 186048 DEBUG nova.compute.resource_tracker [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:57:03 compute-0 nova_compute[186044]: 2025-11-22 09:57:03.009 186048 DEBUG nova.compute.resource_tracker [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:57:03 compute-0 python3.9[186918]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:57:03 compute-0 systemd[1]: Stopping nova_compute container...
Nov 22 09:57:03 compute-0 nova_compute[186044]: 2025-11-22 09:57:03.249 186048 DEBUG oslo_concurrency.lockutils [None req-6b2f3621-d07a-4ff1-a12c-cb282d31f911 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:57:03 compute-0 nova_compute[186044]: 2025-11-22 09:57:03.250 186048 DEBUG oslo_concurrency.lockutils [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 09:57:03 compute-0 nova_compute[186044]: 2025-11-22 09:57:03.250 186048 DEBUG oslo_concurrency.lockutils [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 09:57:03 compute-0 nova_compute[186044]: 2025-11-22 09:57:03.250 186048 DEBUG oslo_concurrency.lockutils [None req-3845a05c-183b-4cc9-b38b-c9250fd379cc - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 09:57:03 compute-0 virtqemud[186556]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 22 09:57:03 compute-0 virtqemud[186556]: hostname: compute-0
Nov 22 09:57:03 compute-0 virtqemud[186556]: End of file while reading data: Input/output error
Nov 22 09:57:03 compute-0 systemd[1]: libpod-711bb2256a425103fa554a92aa26e398a596da483c65e94bac2e94504aba5123.scope: Deactivated successfully.
Nov 22 09:57:03 compute-0 systemd[1]: libpod-711bb2256a425103fa554a92aa26e398a596da483c65e94bac2e94504aba5123.scope: Consumed 3.069s CPU time.
Nov 22 09:57:03 compute-0 podman[186924]: 2025-11-22 09:57:03.60743998 +0000 UTC m=+0.474578466 container died 711bb2256a425103fa554a92aa26e398a596da483c65e94bac2e94504aba5123 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 09:57:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-711bb2256a425103fa554a92aa26e398a596da483c65e94bac2e94504aba5123-userdata-shm.mount: Deactivated successfully.
Nov 22 09:57:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-7e7f1653dec9c07c591cec335ecf4ad6b9e03a53934e37187e960c4d87412e43-merged.mount: Deactivated successfully.
Nov 22 09:57:04 compute-0 podman[186924]: 2025-11-22 09:57:04.090723871 +0000 UTC m=+0.957862377 container cleanup 711bb2256a425103fa554a92aa26e398a596da483c65e94bac2e94504aba5123 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:57:04 compute-0 podman[186924]: nova_compute
Nov 22 09:57:04 compute-0 podman[186954]: nova_compute
Nov 22 09:57:04 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 22 09:57:04 compute-0 systemd[1]: Stopped nova_compute container.
Nov 22 09:57:04 compute-0 systemd[1]: Starting nova_compute container...
Nov 22 09:57:04 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:57:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e7f1653dec9c07c591cec335ecf4ad6b9e03a53934e37187e960c4d87412e43/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 22 09:57:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e7f1653dec9c07c591cec335ecf4ad6b9e03a53934e37187e960c4d87412e43/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 09:57:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e7f1653dec9c07c591cec335ecf4ad6b9e03a53934e37187e960c4d87412e43/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 09:57:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e7f1653dec9c07c591cec335ecf4ad6b9e03a53934e37187e960c4d87412e43/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 22 09:57:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e7f1653dec9c07c591cec335ecf4ad6b9e03a53934e37187e960c4d87412e43/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 09:57:04 compute-0 podman[186966]: 2025-11-22 09:57:04.427438616 +0000 UTC m=+0.215452885 container init 711bb2256a425103fa554a92aa26e398a596da483c65e94bac2e94504aba5123 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 09:57:04 compute-0 podman[186966]: 2025-11-22 09:57:04.434969679 +0000 UTC m=+0.222983908 container start 711bb2256a425103fa554a92aa26e398a596da483c65e94bac2e94504aba5123 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Nov 22 09:57:04 compute-0 nova_compute[186981]: + sudo -E kolla_set_configs
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Validating config file
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Copying service configuration files
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Deleting /etc/ceph
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Creating directory /etc/ceph
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /etc/ceph
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Writing out command to execute
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 09:57:04 compute-0 nova_compute[186981]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 09:57:04 compute-0 nova_compute[186981]: ++ cat /run_command
Nov 22 09:57:04 compute-0 nova_compute[186981]: + CMD=nova-compute
Nov 22 09:57:04 compute-0 nova_compute[186981]: + ARGS=
Nov 22 09:57:04 compute-0 nova_compute[186981]: + sudo kolla_copy_cacerts
Nov 22 09:57:04 compute-0 podman[186966]: nova_compute
Nov 22 09:57:04 compute-0 nova_compute[186981]: + [[ ! -n '' ]]
Nov 22 09:57:04 compute-0 nova_compute[186981]: + . kolla_extend_start
Nov 22 09:57:04 compute-0 nova_compute[186981]: + echo 'Running command: '\''nova-compute'\'''
Nov 22 09:57:04 compute-0 nova_compute[186981]: Running command: 'nova-compute'
Nov 22 09:57:04 compute-0 nova_compute[186981]: + umask 0022
Nov 22 09:57:04 compute-0 nova_compute[186981]: + exec nova-compute
Nov 22 09:57:04 compute-0 systemd[1]: Started nova_compute container.
Nov 22 09:57:04 compute-0 sudo[186916]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:05 compute-0 sudo[187142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tahcpmwhzurvntpcsrpwprvbqisffawm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805424.7989008-1546-139594421925365/AnsiballZ_podman_container.py'
Nov 22 09:57:05 compute-0 sudo[187142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:05 compute-0 python3.9[187144]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 22 09:57:05 compute-0 systemd[1]: Started libpod-conmon-f65c519806aa68cbad2c5b5acc9006b59f74c5ef19220a070c88dd43166baaaa.scope.
Nov 22 09:57:05 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:57:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/683a60516dbd3735d38662bb8f5ff3a449d0e7359001a9ce2ea6fec688c8d135/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 22 09:57:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/683a60516dbd3735d38662bb8f5ff3a449d0e7359001a9ce2ea6fec688c8d135/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 09:57:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/683a60516dbd3735d38662bb8f5ff3a449d0e7359001a9ce2ea6fec688c8d135/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 22 09:57:06 compute-0 nova_compute[186981]: 2025-11-22 09:57:06.431 186985 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 09:57:06 compute-0 nova_compute[186981]: 2025-11-22 09:57:06.431 186985 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 09:57:06 compute-0 nova_compute[186981]: 2025-11-22 09:57:06.432 186985 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 09:57:06 compute-0 nova_compute[186981]: 2025-11-22 09:57:06.432 186985 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 22 09:57:06 compute-0 podman[187170]: 2025-11-22 09:57:06.492058714 +0000 UTC m=+0.954685661 container init f65c519806aa68cbad2c5b5acc9006b59f74c5ef19220a070c88dd43166baaaa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:57:06 compute-0 podman[187170]: 2025-11-22 09:57:06.501127788 +0000 UTC m=+0.963754725 container start f65c519806aa68cbad2c5b5acc9006b59f74c5ef19220a070c88dd43166baaaa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Applying nova statedir ownership
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 22 09:57:06 compute-0 nova_compute_init[187193]: INFO:nova_statedir:Nova statedir ownership complete
Nov 22 09:57:06 compute-0 nova_compute[186981]: 2025-11-22 09:57:06.566 186985 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 09:57:06 compute-0 systemd[1]: libpod-f65c519806aa68cbad2c5b5acc9006b59f74c5ef19220a070c88dd43166baaaa.scope: Deactivated successfully.
Nov 22 09:57:06 compute-0 nova_compute[186981]: 2025-11-22 09:57:06.583 186985 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 09:57:06 compute-0 nova_compute[186981]: 2025-11-22 09:57:06.583 186985 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 22 09:57:06 compute-0 python3.9[187144]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 22 09:57:06 compute-0 podman[187195]: 2025-11-22 09:57:06.735883702 +0000 UTC m=+0.153045950 container died f65c519806aa68cbad2c5b5acc9006b59f74c5ef19220a070c88dd43166baaaa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.068 186985 INFO nova.virt.driver [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 22 09:57:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f65c519806aa68cbad2c5b5acc9006b59f74c5ef19220a070c88dd43166baaaa-userdata-shm.mount: Deactivated successfully.
Nov 22 09:57:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-683a60516dbd3735d38662bb8f5ff3a449d0e7359001a9ce2ea6fec688c8d135-merged.mount: Deactivated successfully.
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.166 186985 INFO nova.compute.provider_config [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.182 186985 DEBUG oslo_concurrency.lockutils [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.182 186985 DEBUG oslo_concurrency.lockutils [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.182 186985 DEBUG oslo_concurrency.lockutils [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.183 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.183 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.183 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.183 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.183 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.183 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.183 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.184 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.184 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.184 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.184 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.184 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.184 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.184 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.184 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.185 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.185 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.185 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.185 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.185 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.185 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.185 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.186 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.186 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.186 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.186 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.186 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.186 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.186 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.187 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.187 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.187 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.187 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.187 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.187 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.187 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.188 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.188 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.188 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.188 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.188 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.188 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.189 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.189 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.189 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.189 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.189 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.189 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.189 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.190 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.190 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.190 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.190 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.190 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.190 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.191 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.191 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.191 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.191 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.191 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.191 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.191 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.192 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.192 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.192 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.192 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.192 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.192 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.192 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.193 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.193 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.193 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.193 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.193 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.193 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.193 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.194 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.194 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.194 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.194 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.194 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.194 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.194 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.194 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.195 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.195 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.195 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.195 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.195 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.195 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.195 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.196 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.196 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.196 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.196 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.196 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.196 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.196 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.196 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.197 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.197 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.197 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.197 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.197 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.197 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.197 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.198 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.198 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.198 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.198 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.198 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.198 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.198 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.199 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.199 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.199 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.199 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.199 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.199 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.199 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.199 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.200 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.200 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.200 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.200 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.200 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.200 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.200 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.201 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.201 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.201 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.201 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.201 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.201 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.201 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.201 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.202 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.202 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.202 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.202 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.202 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.202 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.202 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.202 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.203 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.203 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.203 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.203 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.203 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.203 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.203 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.204 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.204 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.204 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.204 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.204 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.204 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.204 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.205 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.205 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.205 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.205 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.205 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.206 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.206 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.206 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.206 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.206 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.206 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.206 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.207 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.207 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.207 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.207 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.207 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.207 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.207 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.208 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.208 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.208 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.208 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.208 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.208 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.208 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.209 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.209 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.209 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.209 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.209 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.210 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.210 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.210 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.210 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.210 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.210 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.211 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.211 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.211 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.211 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.211 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.212 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.212 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.212 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.212 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.212 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.212 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.213 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.213 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.213 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.213 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.213 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.213 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.213 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.213 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.214 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.214 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.214 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.214 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.214 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.214 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.214 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.214 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.215 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.215 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.215 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.215 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.215 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.215 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.215 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.216 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.216 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.216 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.216 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.216 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.216 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.217 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.217 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.217 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.217 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.217 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.218 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.218 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.218 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.218 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.218 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.218 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.219 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.219 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.219 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.219 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.219 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.220 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.220 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.220 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.220 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.220 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.221 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.221 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.221 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.221 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.221 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.221 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.222 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.222 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.222 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.222 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.222 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.222 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.222 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.222 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.223 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.223 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.223 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.223 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.223 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.223 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.224 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.224 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.224 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.224 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.224 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.225 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.225 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.225 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.225 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.225 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.225 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.226 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.226 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.226 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.226 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.226 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.227 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.227 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.227 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.227 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.227 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.228 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.228 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.228 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.228 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.228 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.228 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.229 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.229 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.229 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.229 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.229 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.229 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.229 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.230 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.230 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.230 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.230 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.230 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.230 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.231 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.231 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.231 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.231 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.231 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.231 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.231 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.232 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.232 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.232 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.232 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.232 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.232 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.232 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.232 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.233 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.233 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.233 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.233 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.233 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.233 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.234 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.234 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.234 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.234 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.234 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.234 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.235 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.235 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.235 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.235 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.235 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.235 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.235 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.236 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.236 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.236 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.236 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.236 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.236 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.237 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.237 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.237 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.237 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.237 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.237 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.237 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.238 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.238 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.238 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.238 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.238 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.238 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.238 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.238 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.239 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.239 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.239 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.239 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.239 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.239 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.239 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.240 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.240 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.240 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.240 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.240 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.241 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.241 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.241 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.241 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.241 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.241 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.241 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.242 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.242 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.242 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.242 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.242 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.242 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.242 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.243 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.243 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.243 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.243 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.243 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.243 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.244 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.244 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.244 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.244 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.244 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.244 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.244 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.245 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.245 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.245 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.245 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.245 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.245 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.246 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.246 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.246 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.246 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.246 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.246 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.246 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.247 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.247 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.247 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.247 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.247 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.247 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.247 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.248 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.248 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.248 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.248 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.248 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.248 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.248 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.249 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.249 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.249 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.249 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.249 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.249 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.249 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.250 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.250 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.250 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.250 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.250 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.250 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.250 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.251 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.251 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.251 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.251 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.251 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.251 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.251 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.252 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.252 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.252 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.252 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.252 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.252 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.252 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.252 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.253 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.253 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.253 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.253 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.253 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.253 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.253 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.254 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.254 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.254 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.254 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.254 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.254 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.254 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.255 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.255 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.255 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.255 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.255 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.255 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.255 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.256 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.256 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.256 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.256 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.256 186985 WARNING oslo_config.cfg [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 22 09:57:07 compute-0 nova_compute[186981]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 22 09:57:07 compute-0 nova_compute[186981]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 22 09:57:07 compute-0 nova_compute[186981]: and ``live_migration_inbound_addr`` respectively.
Nov 22 09:57:07 compute-0 nova_compute[186981]: ).  Its value may be silently ignored in the future.
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.256 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.257 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.257 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.257 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.257 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.257 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.257 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.257 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.258 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.258 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.258 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.258 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.258 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.258 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.258 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.259 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.259 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.259 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.259 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.259 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.259 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.259 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.260 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.260 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.260 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.260 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.260 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.260 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.260 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.261 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.261 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.261 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.261 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.261 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.262 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.262 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.262 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.262 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.262 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.262 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.263 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.263 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.263 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.263 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.263 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.263 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.263 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.264 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.264 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.264 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.264 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.264 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.264 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.264 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.265 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.265 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.265 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.265 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.265 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.265 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.265 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.266 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.266 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.266 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.266 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.266 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.266 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.266 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.267 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.267 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.267 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.267 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.267 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.267 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.267 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.268 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.268 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.268 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.268 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.268 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.268 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.268 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.269 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.269 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.269 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.269 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.269 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.269 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.269 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.270 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.270 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.270 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.270 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.270 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.270 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.271 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.271 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.271 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.271 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.271 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.271 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.271 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.272 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.272 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.272 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.272 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.272 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.272 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.272 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.273 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.273 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.273 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.273 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.273 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.273 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.274 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.274 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.274 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.274 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.274 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.274 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.274 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.274 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.275 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.275 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.275 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.275 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.275 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.275 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.275 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.276 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.276 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.276 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.276 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.276 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.276 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.276 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.277 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.277 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.277 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.277 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.277 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.277 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.278 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.278 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.278 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.278 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.278 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.278 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.278 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.278 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.279 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.279 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.279 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.279 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.279 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.279 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.279 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.280 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.280 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.280 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.280 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.280 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.280 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.280 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.281 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.281 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.281 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.281 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.281 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.281 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.281 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.282 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.282 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.282 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.282 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.282 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.282 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.282 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.283 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.283 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.283 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.283 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.283 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.283 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.284 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.284 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.284 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.284 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.284 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.284 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.284 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.285 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.285 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.285 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.285 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.285 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.285 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.286 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.286 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.286 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.286 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.286 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.286 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.286 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.286 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.287 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.287 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.287 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.287 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.287 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.287 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.287 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.287 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.288 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.288 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.288 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.288 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.288 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.288 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.289 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.289 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.289 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.289 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.289 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.289 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.289 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.290 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.290 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.290 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.290 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.290 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.290 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.290 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.290 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.291 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.291 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.291 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.291 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.291 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.291 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.291 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.291 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.292 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.292 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.292 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.292 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.292 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.292 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.293 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.293 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.293 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.293 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.293 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.293 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.294 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.294 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.294 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.294 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.294 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.294 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.294 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.294 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.295 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.295 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.295 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.295 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.295 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.295 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.295 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.296 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.296 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.296 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.296 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.296 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.296 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.296 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.297 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.297 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.297 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.297 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.297 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.297 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.297 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.298 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.298 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.298 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.298 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.298 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.298 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.299 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.299 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.299 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.299 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.299 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.299 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.300 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.300 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.300 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.300 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.300 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.300 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.300 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.301 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.301 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.301 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.301 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.301 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.301 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.301 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.302 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.302 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.302 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.302 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.302 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.302 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.302 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.303 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.303 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.303 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.303 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.303 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.303 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.303 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.304 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.304 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.304 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.304 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.304 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.304 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.304 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.305 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.305 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.305 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.305 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.305 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.305 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.306 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.306 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.306 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.306 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.306 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.306 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.306 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.307 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.307 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.307 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.307 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.307 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.307 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.307 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.308 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.308 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.308 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.308 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.308 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.308 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.308 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.309 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.309 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.309 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.309 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.309 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.309 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.309 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.310 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.310 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.310 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.310 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.310 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.310 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.310 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.311 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.311 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.311 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.311 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.311 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.311 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.312 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.312 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.312 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.312 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.312 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.312 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.312 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.313 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.313 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.313 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.313 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.313 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.313 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.313 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.314 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.314 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.314 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.314 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.314 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.314 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.315 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.315 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.315 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.315 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.315 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.315 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.315 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.316 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.316 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.316 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.316 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.316 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.316 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.316 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.317 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.317 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.317 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.317 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.317 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.317 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.317 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.318 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.318 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.318 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.318 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.318 186985 DEBUG oslo_service.service [None req-1c3efe33-3e38-410f-9bc2-6041a05b0926 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.319 186985 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 22 09:57:07 compute-0 podman[187195]: 2025-11-22 09:57:07.326967191 +0000 UTC m=+0.744129419 container cleanup f65c519806aa68cbad2c5b5acc9006b59f74c5ef19220a070c88dd43166baaaa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:57:07 compute-0 systemd[1]: libpod-conmon-f65c519806aa68cbad2c5b5acc9006b59f74c5ef19220a070c88dd43166baaaa.scope: Deactivated successfully.
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.342 186985 INFO nova.virt.node [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Determined node identity dd02da68-d6c7-4f1a-8710-21abb7ad1703 from /var/lib/nova/compute_id
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.343 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.344 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.344 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.344 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.362 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fe8d07f1e50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.365 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fe8d07f1e50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.366 186985 INFO nova.virt.libvirt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Connection event '1' reason 'None'
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.373 186985 INFO nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Libvirt host capabilities <capabilities>
Nov 22 09:57:07 compute-0 nova_compute[186981]: 
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <host>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <uuid>451f2cde-49d1-45fa-bcb2-7147a4a4b091</uuid>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <cpu>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <arch>x86_64</arch>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model>EPYC-Rome-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <vendor>AMD</vendor>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <microcode version='16777317'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <signature family='23' model='49' stepping='0'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='x2apic'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='tsc-deadline'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='osxsave'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='hypervisor'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='tsc_adjust'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='spec-ctrl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='stibp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='arch-capabilities'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='cmp_legacy'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='topoext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='virt-ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='lbrv'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='tsc-scale'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='vmcb-clean'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='pause-filter'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='pfthreshold'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='svme-addr-chk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='rdctl-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='skip-l1dfl-vmentry'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='mds-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature name='pschange-mc-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <pages unit='KiB' size='4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <pages unit='KiB' size='2048'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <pages unit='KiB' size='1048576'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </cpu>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <power_management>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <suspend_mem/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <suspend_disk/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <suspend_hybrid/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </power_management>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <iommu support='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <migration_features>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <live/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <uri_transports>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <uri_transport>tcp</uri_transport>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <uri_transport>rdma</uri_transport>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </uri_transports>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </migration_features>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <topology>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <cells num='1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <cell id='0'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:           <memory unit='KiB'>7864320</memory>
Nov 22 09:57:07 compute-0 nova_compute[186981]:           <pages unit='KiB' size='4'>1966080</pages>
Nov 22 09:57:07 compute-0 nova_compute[186981]:           <pages unit='KiB' size='2048'>0</pages>
Nov 22 09:57:07 compute-0 nova_compute[186981]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 22 09:57:07 compute-0 nova_compute[186981]:           <distances>
Nov 22 09:57:07 compute-0 nova_compute[186981]:             <sibling id='0' value='10'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:           </distances>
Nov 22 09:57:07 compute-0 nova_compute[186981]:           <cpus num='8'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:           </cpus>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         </cell>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </cells>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </topology>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <cache>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </cache>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <secmodel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model>selinux</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <doi>0</doi>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </secmodel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <secmodel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model>dac</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <doi>0</doi>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </secmodel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </host>
Nov 22 09:57:07 compute-0 nova_compute[186981]: 
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <guest>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <os_type>hvm</os_type>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <arch name='i686'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <wordsize>32</wordsize>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <domain type='qemu'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <domain type='kvm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </arch>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <features>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <pae/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <nonpae/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <acpi default='on' toggle='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <apic default='on' toggle='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <cpuselection/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <deviceboot/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <disksnapshot default='on' toggle='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <externalSnapshot/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </features>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </guest>
Nov 22 09:57:07 compute-0 nova_compute[186981]: 
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <guest>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <os_type>hvm</os_type>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <arch name='x86_64'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <wordsize>64</wordsize>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <domain type='qemu'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <domain type='kvm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </arch>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <features>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <acpi default='on' toggle='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <apic default='on' toggle='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <cpuselection/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <deviceboot/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <disksnapshot default='on' toggle='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <externalSnapshot/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </features>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </guest>
Nov 22 09:57:07 compute-0 nova_compute[186981]: 
Nov 22 09:57:07 compute-0 nova_compute[186981]: </capabilities>
Nov 22 09:57:07 compute-0 nova_compute[186981]: 
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.381 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.383 186985 DEBUG nova.virt.libvirt.volume.mount [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.389 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 22 09:57:07 compute-0 nova_compute[186981]: <domainCapabilities>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <domain>kvm</domain>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <arch>i686</arch>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <vcpu max='4096'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <iothreads supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <os supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <enum name='firmware'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <loader supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>rom</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pflash</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='readonly'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>yes</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>no</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='secure'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>no</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </loader>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </os>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <cpu>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='host-passthrough' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='hostPassthroughMigratable'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>on</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>off</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='maximum' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='maximumMigratable'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>on</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>off</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='host-model' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <vendor>AMD</vendor>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='x2apic'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='hypervisor'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='stibp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='overflow-recov'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='succor'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='lbrv'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc-scale'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='flushbyasid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='pause-filter'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='pfthreshold'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='disable' name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='custom' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Dhyana-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Genoa'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='auto-ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='auto-ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-128'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-256'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-512'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v6'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v7'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='KnightsMill'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512er'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512pf'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='KnightsMill-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512er'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512pf'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G4-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tbm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G5-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tbm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 sudo[187142]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SierraForest'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cmpccxadd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SierraForest-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cmpccxadd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='athlon'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='athlon-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='core2duo'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='core2duo-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='coreduo'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='coreduo-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='n270'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='n270-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='phenom'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='phenom-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </cpu>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <memoryBacking supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <enum name='sourceType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>file</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>anonymous</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>memfd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </memoryBacking>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <devices>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <disk supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='diskDevice'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>disk</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>cdrom</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>floppy</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>lun</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='bus'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>fdc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>scsi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>sata</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-non-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </disk>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <graphics supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vnc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>egl-headless</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dbus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </graphics>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <video supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='modelType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vga</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>cirrus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>none</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>bochs</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>ramfb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </video>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <hostdev supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='mode'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>subsystem</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='startupPolicy'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>default</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>mandatory</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>requisite</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>optional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='subsysType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pci</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>scsi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='capsType'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='pciBackend'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </hostdev>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <rng supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-non-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>random</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>egd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>builtin</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </rng>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <filesystem supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='driverType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>path</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>handle</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtiofs</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </filesystem>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <tpm supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tpm-tis</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tpm-crb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>emulator</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>external</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendVersion'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>2.0</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </tpm>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <redirdev supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='bus'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </redirdev>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <channel supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pty</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>unix</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </channel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <crypto supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>qemu</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>builtin</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </crypto>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <interface supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>default</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>passt</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </interface>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <panic supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>isa</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>hyperv</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </panic>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <console supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>null</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pty</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dev</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>file</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pipe</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>stdio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>udp</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tcp</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>unix</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>qemu-vdagent</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dbus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </console>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </devices>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <features>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <gic supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <vmcoreinfo supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <genid supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <backingStoreInput supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <backup supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <async-teardown supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <ps2 supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <sev supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <sgx supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <hyperv supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='features'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>relaxed</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vapic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>spinlocks</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vpindex</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>runtime</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>synic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>stimer</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>reset</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vendor_id</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>frequencies</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>reenlightenment</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tlbflush</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>ipi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>avic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>emsr_bitmap</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>xmm_input</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <defaults>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <spinlocks>4095</spinlocks>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <stimer_direct>on</stimer_direct>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </defaults>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </hyperv>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <launchSecurity supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='sectype'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tdx</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </launchSecurity>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </features>
Nov 22 09:57:07 compute-0 nova_compute[186981]: </domainCapabilities>
Nov 22 09:57:07 compute-0 nova_compute[186981]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.399 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 22 09:57:07 compute-0 nova_compute[186981]: <domainCapabilities>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <domain>kvm</domain>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <arch>i686</arch>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <vcpu max='240'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <iothreads supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <os supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <enum name='firmware'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <loader supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>rom</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pflash</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='readonly'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>yes</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>no</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='secure'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>no</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </loader>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </os>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <cpu>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='host-passthrough' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='hostPassthroughMigratable'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>on</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>off</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='maximum' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='maximumMigratable'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>on</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>off</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='host-model' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <vendor>AMD</vendor>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='x2apic'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='hypervisor'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='stibp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='overflow-recov'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='succor'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='lbrv'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc-scale'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='flushbyasid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='pause-filter'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='pfthreshold'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='disable' name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='custom' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Dhyana-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Genoa'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='auto-ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='auto-ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-128'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-256'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-512'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v6'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v7'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='KnightsMill'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512er'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512pf'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='KnightsMill-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512er'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512pf'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G4-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tbm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G5-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tbm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SierraForest'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cmpccxadd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SierraForest-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cmpccxadd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='athlon'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='athlon-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='core2duo'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='core2duo-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='coreduo'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='coreduo-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='n270'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='n270-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='phenom'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='phenom-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </cpu>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <memoryBacking supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <enum name='sourceType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>file</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>anonymous</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>memfd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </memoryBacking>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <devices>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <disk supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='diskDevice'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>disk</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>cdrom</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>floppy</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>lun</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='bus'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>ide</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>fdc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>scsi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>sata</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-non-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </disk>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <graphics supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vnc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>egl-headless</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dbus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </graphics>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <video supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='modelType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vga</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>cirrus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>none</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>bochs</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>ramfb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </video>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <hostdev supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='mode'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>subsystem</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='startupPolicy'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>default</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>mandatory</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>requisite</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>optional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='subsysType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pci</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>scsi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='capsType'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='pciBackend'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </hostdev>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <rng supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-non-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>random</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>egd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>builtin</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </rng>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <filesystem supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='driverType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>path</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>handle</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtiofs</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </filesystem>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <tpm supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tpm-tis</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tpm-crb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>emulator</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>external</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendVersion'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>2.0</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </tpm>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <redirdev supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='bus'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </redirdev>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <channel supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pty</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>unix</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </channel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <crypto supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>qemu</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>builtin</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </crypto>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <interface supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>default</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>passt</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </interface>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <panic supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>isa</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>hyperv</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </panic>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <console supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>null</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pty</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dev</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>file</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pipe</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>stdio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>udp</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tcp</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>unix</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>qemu-vdagent</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dbus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </console>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </devices>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <features>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <gic supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <vmcoreinfo supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <genid supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <backingStoreInput supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <backup supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <async-teardown supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <ps2 supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <sev supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <sgx supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <hyperv supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='features'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>relaxed</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vapic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>spinlocks</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vpindex</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>runtime</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>synic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>stimer</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>reset</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vendor_id</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>frequencies</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>reenlightenment</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tlbflush</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>ipi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>avic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>emsr_bitmap</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>xmm_input</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <defaults>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <spinlocks>4095</spinlocks>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <stimer_direct>on</stimer_direct>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </defaults>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </hyperv>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <launchSecurity supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='sectype'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tdx</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </launchSecurity>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </features>
Nov 22 09:57:07 compute-0 nova_compute[186981]: </domainCapabilities>
Nov 22 09:57:07 compute-0 nova_compute[186981]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.431 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.437 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 22 09:57:07 compute-0 nova_compute[186981]: <domainCapabilities>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <domain>kvm</domain>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <arch>x86_64</arch>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <vcpu max='4096'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <iothreads supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <os supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <enum name='firmware'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>efi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <loader supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>rom</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pflash</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='readonly'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>yes</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>no</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='secure'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>yes</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>no</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </loader>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </os>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <cpu>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='host-passthrough' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='hostPassthroughMigratable'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>on</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>off</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='maximum' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='maximumMigratable'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>on</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>off</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='host-model' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <vendor>AMD</vendor>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='x2apic'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='hypervisor'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='stibp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='overflow-recov'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='succor'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='lbrv'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc-scale'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='flushbyasid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='pause-filter'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='pfthreshold'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='disable' name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='custom' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Dhyana-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Genoa'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='auto-ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='auto-ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-128'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-256'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-512'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v6'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v7'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='KnightsMill'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512er'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512pf'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='KnightsMill-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512er'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512pf'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G4-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tbm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G5-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tbm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SierraForest'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cmpccxadd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SierraForest-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cmpccxadd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='athlon'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='athlon-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='core2duo'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='core2duo-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='coreduo'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='coreduo-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='n270'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='n270-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='phenom'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='phenom-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </cpu>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <memoryBacking supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <enum name='sourceType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>file</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>anonymous</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>memfd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </memoryBacking>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <devices>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <disk supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='diskDevice'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>disk</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>cdrom</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>floppy</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>lun</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='bus'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>fdc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>scsi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>sata</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-non-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </disk>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <graphics supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vnc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>egl-headless</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dbus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </graphics>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <video supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='modelType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vga</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>cirrus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>none</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>bochs</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>ramfb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </video>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <hostdev supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='mode'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>subsystem</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='startupPolicy'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>default</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>mandatory</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>requisite</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>optional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='subsysType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pci</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>scsi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='capsType'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='pciBackend'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </hostdev>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <rng supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-non-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>random</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>egd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>builtin</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </rng>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <filesystem supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='driverType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>path</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>handle</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtiofs</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </filesystem>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <tpm supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tpm-tis</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tpm-crb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>emulator</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>external</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendVersion'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>2.0</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </tpm>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <redirdev supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='bus'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </redirdev>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <channel supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pty</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>unix</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </channel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <crypto supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>qemu</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>builtin</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </crypto>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <interface supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>default</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>passt</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </interface>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <panic supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>isa</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>hyperv</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </panic>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <console supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>null</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pty</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dev</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>file</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pipe</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>stdio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>udp</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tcp</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>unix</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>qemu-vdagent</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dbus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </console>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </devices>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <features>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <gic supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <vmcoreinfo supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <genid supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <backingStoreInput supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <backup supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <async-teardown supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <ps2 supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <sev supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <sgx supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <hyperv supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='features'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>relaxed</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vapic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>spinlocks</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vpindex</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>runtime</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>synic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>stimer</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>reset</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vendor_id</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>frequencies</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>reenlightenment</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tlbflush</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>ipi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>avic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>emsr_bitmap</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>xmm_input</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <defaults>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <spinlocks>4095</spinlocks>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <stimer_direct>on</stimer_direct>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </defaults>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </hyperv>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <launchSecurity supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='sectype'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tdx</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </launchSecurity>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </features>
Nov 22 09:57:07 compute-0 nova_compute[186981]: </domainCapabilities>
Nov 22 09:57:07 compute-0 nova_compute[186981]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.516 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 22 09:57:07 compute-0 nova_compute[186981]: <domainCapabilities>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <domain>kvm</domain>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <arch>x86_64</arch>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <vcpu max='240'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <iothreads supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <os supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <enum name='firmware'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <loader supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>rom</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pflash</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='readonly'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>yes</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>no</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='secure'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>no</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </loader>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </os>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <cpu>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='host-passthrough' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='hostPassthroughMigratable'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>on</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>off</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='maximum' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='maximumMigratable'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>on</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>off</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='host-model' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <vendor>AMD</vendor>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='x2apic'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='hypervisor'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='stibp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='overflow-recov'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='succor'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='lbrv'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='tsc-scale'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='flushbyasid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='pause-filter'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='pfthreshold'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <feature policy='disable' name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <mode name='custom' supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Broadwell-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Cooperlake-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Denverton-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Dhyana-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Genoa'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='auto-ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='auto-ibrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Milan-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amd-psfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='no-nested-data-bp'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='null-sel-clr-base'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='stibp-always-on'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-Rome-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='EPYC-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='GraniteRapids-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-128'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-256'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx10-512'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='prefetchiti'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Haswell-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v6'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Icelake-Server-v7'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='IvyBridge-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='KnightsMill'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512er'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512pf'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='KnightsMill-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4fmaps'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-4vnniw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512er'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512pf'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G4-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tbm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Opteron_G5-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fma4'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tbm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xop'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SapphireRapids-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='amx-tile'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-bf16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-fp16'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512-vpopcntdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bitalg'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vbmi2'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrc'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fzrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='la57'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='taa-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='tsx-ldtrk'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xfd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SierraForest'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cmpccxadd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='SierraForest-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ifma'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-ne-convert'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx-vnni-int8'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='bus-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cmpccxadd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fbsdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='fsrs'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ibrs-all'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mcdt-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pbrsb-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='psdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='sbdr-ssdp-no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='serialize'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vaes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='vpclmulqdq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Client-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='hle'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='rtm'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Skylake-Server-v5'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512bw'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512cd'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512dq'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512f'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='avx512vl'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='invpcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pcid'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='pku'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='mpx'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v2'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v3'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='core-capability'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='split-lock-detect'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='Snowridge-v4'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='cldemote'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='erms'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='gfni'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdir64b'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='movdiri'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='xsaves'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='athlon'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='athlon-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='core2duo'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='core2duo-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='coreduo'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='coreduo-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='n270'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='n270-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='ss'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='phenom'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <blockers model='phenom-v1'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnow'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <feature name='3dnowext'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </blockers>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </mode>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </cpu>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <memoryBacking supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <enum name='sourceType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>file</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>anonymous</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <value>memfd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </memoryBacking>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <devices>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <disk supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='diskDevice'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>disk</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>cdrom</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>floppy</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>lun</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='bus'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>ide</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>fdc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>scsi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>sata</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-non-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </disk>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <graphics supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vnc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>egl-headless</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dbus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </graphics>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <video supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='modelType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vga</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>cirrus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>none</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>bochs</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>ramfb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </video>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <hostdev supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='mode'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>subsystem</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='startupPolicy'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>default</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>mandatory</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>requisite</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>optional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='subsysType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pci</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>scsi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='capsType'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='pciBackend'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </hostdev>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <rng supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtio-non-transitional</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>random</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>egd</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>builtin</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </rng>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <filesystem supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='driverType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>path</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>handle</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>virtiofs</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </filesystem>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <tpm supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tpm-tis</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tpm-crb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>emulator</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>external</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendVersion'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>2.0</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </tpm>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <redirdev supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='bus'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>usb</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </redirdev>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <channel supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pty</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>unix</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </channel>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <crypto supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>qemu</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendModel'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>builtin</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </crypto>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <interface supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='backendType'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>default</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>passt</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </interface>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <panic supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='model'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>isa</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>hyperv</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </panic>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <console supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='type'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>null</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vc</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pty</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dev</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>file</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>pipe</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>stdio</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>udp</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tcp</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>unix</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>qemu-vdagent</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>dbus</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </console>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </devices>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   <features>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <gic supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <vmcoreinfo supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <genid supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <backingStoreInput supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <backup supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <async-teardown supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <ps2 supported='yes'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <sev supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <sgx supported='no'/>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <hyperv supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='features'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>relaxed</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vapic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>spinlocks</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vpindex</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>runtime</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>synic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>stimer</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>reset</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>vendor_id</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>frequencies</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>reenlightenment</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tlbflush</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>ipi</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>avic</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>emsr_bitmap</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>xmm_input</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <defaults>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <spinlocks>4095</spinlocks>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <stimer_direct>on</stimer_direct>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </defaults>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </hyperv>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     <launchSecurity supported='yes'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       <enum name='sectype'>
Nov 22 09:57:07 compute-0 nova_compute[186981]:         <value>tdx</value>
Nov 22 09:57:07 compute-0 nova_compute[186981]:       </enum>
Nov 22 09:57:07 compute-0 nova_compute[186981]:     </launchSecurity>
Nov 22 09:57:07 compute-0 nova_compute[186981]:   </features>
Nov 22 09:57:07 compute-0 nova_compute[186981]: </domainCapabilities>
Nov 22 09:57:07 compute-0 nova_compute[186981]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.593 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.594 186985 INFO nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Secure Boot support detected
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.595 186985 INFO nova.virt.libvirt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.595 186985 INFO nova.virt.libvirt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.604 186985 DEBUG nova.virt.libvirt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.634 186985 INFO nova.virt.node [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Determined node identity dd02da68-d6c7-4f1a-8710-21abb7ad1703 from /var/lib/nova/compute_id
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.653 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Verified node dd02da68-d6c7-4f1a-8710-21abb7ad1703 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 22 09:57:07 compute-0 nova_compute[186981]: 2025-11-22 09:57:07.681 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 22 09:57:07 compute-0 sshd-session[158847]: Connection closed by 192.168.122.30 port 44782
Nov 22 09:57:07 compute-0 sshd-session[158844]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:57:07 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Nov 22 09:57:07 compute-0 systemd[1]: session-23.scope: Consumed 1min 57.216s CPU time.
Nov 22 09:57:07 compute-0 systemd-logind[819]: Session 23 logged out. Waiting for processes to exit.
Nov 22 09:57:07 compute-0 systemd-logind[819]: Removed session 23.
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.117 186985 ERROR nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Could not retrieve compute node resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'dd02da68-d6c7-4f1a-8710-21abb7ad1703' not found: No resource provider with uuid dd02da68-d6c7-4f1a-8710-21abb7ad1703 found  ", "request_id": "req-2105c13e-1d67-467c-a479-4e7d3ae6ec52"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'dd02da68-d6c7-4f1a-8710-21abb7ad1703' not found: No resource provider with uuid dd02da68-d6c7-4f1a-8710-21abb7ad1703 found  ", "request_id": "req-2105c13e-1d67-467c-a479-4e7d3ae6ec52"}]}
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.136 186985 DEBUG oslo_concurrency.lockutils [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.137 186985 DEBUG oslo_concurrency.lockutils [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.137 186985 DEBUG oslo_concurrency.lockutils [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.137 186985 DEBUG nova.compute.resource_tracker [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.274 186985 WARNING nova.virt.libvirt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.276 186985 DEBUG nova.compute.resource_tracker [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6196MB free_disk=73.66362762451172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.276 186985 DEBUG oslo_concurrency.lockutils [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.277 186985 DEBUG oslo_concurrency.lockutils [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.446 186985 ERROR nova.compute.resource_tracker [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'dd02da68-d6c7-4f1a-8710-21abb7ad1703' not found: No resource provider with uuid dd02da68-d6c7-4f1a-8710-21abb7ad1703 found  ", "request_id": "req-897c083e-0231-424e-b8f1-506410a067a6"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'dd02da68-d6c7-4f1a-8710-21abb7ad1703' not found: No resource provider with uuid dd02da68-d6c7-4f1a-8710-21abb7ad1703 found  ", "request_id": "req-897c083e-0231-424e-b8f1-506410a067a6"}]}
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.447 186985 DEBUG nova.compute.resource_tracker [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.447 186985 DEBUG nova.compute.resource_tracker [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.872 186985 INFO nova.scheduler.client.report [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [req-2c69c2e3-a19e-40ea-b8f8-42e4da349e68] Created resource provider record via placement API for resource provider with UUID dd02da68-d6c7-4f1a-8710-21abb7ad1703 and name compute-0.ctlplane.example.com.
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.913 186985 DEBUG nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 22 09:57:08 compute-0 nova_compute[186981]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.914 186985 INFO nova.virt.libvirt.host [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] kernel doesn't support AMD SEV
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.914 186985 DEBUG nova.compute.provider_tree [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Updating inventory in ProviderTree for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 09:57:08 compute-0 nova_compute[186981]: 2025-11-22 09:57:08.915 186985 DEBUG nova.virt.libvirt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 09:57:09 compute-0 nova_compute[186981]: 2025-11-22 09:57:09.094 186985 DEBUG nova.scheduler.client.report [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Updated inventory for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 22 09:57:09 compute-0 nova_compute[186981]: 2025-11-22 09:57:09.095 186985 DEBUG nova.compute.provider_tree [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Updating resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 22 09:57:09 compute-0 nova_compute[186981]: 2025-11-22 09:57:09.095 186985 DEBUG nova.compute.provider_tree [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Updating inventory in ProviderTree for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 09:57:09 compute-0 nova_compute[186981]: 2025-11-22 09:57:09.260 186985 DEBUG nova.compute.provider_tree [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Updating resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 22 09:57:09 compute-0 nova_compute[186981]: 2025-11-22 09:57:09.294 186985 DEBUG nova.compute.resource_tracker [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:57:09 compute-0 nova_compute[186981]: 2025-11-22 09:57:09.294 186985 DEBUG oslo_concurrency.lockutils [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:57:09 compute-0 nova_compute[186981]: 2025-11-22 09:57:09.294 186985 DEBUG nova.service [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 22 09:57:09 compute-0 nova_compute[186981]: 2025-11-22 09:57:09.402 186985 DEBUG nova.service [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 22 09:57:09 compute-0 nova_compute[186981]: 2025-11-22 09:57:09.403 186985 DEBUG nova.servicegroup.drivers.db [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 22 09:57:13 compute-0 sshd-session[187279]: Accepted publickey for zuul from 192.168.122.30 port 33972 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 09:57:13 compute-0 systemd-logind[819]: New session 25 of user zuul.
Nov 22 09:57:13 compute-0 systemd[1]: Started Session 25 of User zuul.
Nov 22 09:57:13 compute-0 sshd-session[187279]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:57:14 compute-0 python3.9[187432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 09:57:15 compute-0 sudo[187586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgxnmqirqhoawgzroprxssqkphetqqpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805435.2083366-36-227008336443641/AnsiballZ_systemd_service.py'
Nov 22 09:57:15 compute-0 sudo[187586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:16 compute-0 python3.9[187588]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:57:16 compute-0 systemd[1]: Reloading.
Nov 22 09:57:16 compute-0 systemd-sysv-generator[187616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:57:16 compute-0 systemd-rc-local-generator[187610]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:57:16 compute-0 sudo[187586]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:16 compute-0 podman[187648]: 2025-11-22 09:57:16.644279961 +0000 UTC m=+0.103827673 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:57:17 compute-0 python3.9[187800]: ansible-ansible.builtin.service_facts Invoked
Nov 22 09:57:17 compute-0 network[187817]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 09:57:17 compute-0 network[187818]: 'network-scripts' will be removed from distribution in near future.
Nov 22 09:57:17 compute-0 network[187819]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 09:57:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:57:17.921 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:57:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:57:17.922 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:57:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:57:17.922 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:57:20 compute-0 nova_compute[186981]: 2025-11-22 09:57:20.404 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:57:20 compute-0 nova_compute[186981]: 2025-11-22 09:57:20.450 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:57:22 compute-0 podman[188065]: 2025-11-22 09:57:22.03289179 +0000 UTC m=+0.057157470 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:57:22 compute-0 sudo[188104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlqdcqweoywjryroutopzbpcnchflxmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805441.7182956-55-149429888021686/AnsiballZ_systemd_service.py'
Nov 22 09:57:22 compute-0 sudo[188104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:22 compute-0 python3.9[188112]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:57:22 compute-0 sudo[188104]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:23 compute-0 sudo[188263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqbekacnugfvleqkzchjqrrpsxlgadkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805442.705868-65-271131372972103/AnsiballZ_file.py'
Nov 22 09:57:23 compute-0 sudo[188263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:23 compute-0 python3.9[188265]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:23 compute-0 sudo[188263]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:23 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 09:57:23 compute-0 sudo[188416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veumapbvjustjnjxnhlxekpyjhhyedtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805443.533542-73-222447410823978/AnsiballZ_file.py'
Nov 22 09:57:23 compute-0 sudo[188416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:24 compute-0 python3.9[188418]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:24 compute-0 sudo[188416]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:24 compute-0 sudo[188568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlihhkpjicpjybriudtptksdqdsmtjgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805444.3178232-82-266505118050275/AnsiballZ_command.py'
Nov 22 09:57:24 compute-0 sudo[188568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:24 compute-0 python3.9[188570]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:57:24 compute-0 sudo[188568]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:25 compute-0 python3.9[188722]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 09:57:26 compute-0 sudo[188872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erirqvlmosszenkwpvhghgjnuzcltncj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805445.971835-100-149816045240624/AnsiballZ_systemd_service.py'
Nov 22 09:57:26 compute-0 sudo[188872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:26 compute-0 python3.9[188874]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:57:26 compute-0 systemd[1]: Reloading.
Nov 22 09:57:26 compute-0 systemd-rc-local-generator[188902]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:57:26 compute-0 systemd-sysv-generator[188905]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:57:26 compute-0 sudo[188872]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:26 compute-0 podman[188910]: 2025-11-22 09:57:26.951465041 +0000 UTC m=+0.070151821 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS)
Nov 22 09:57:27 compute-0 sudo[189078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fszttflxhmsxrposwjkxksyagqbuuyyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805447.0351903-108-47521086102025/AnsiballZ_command.py'
Nov 22 09:57:27 compute-0 sudo[189078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:27 compute-0 python3.9[189080]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:57:27 compute-0 sudo[189078]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:28 compute-0 sudo[189231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fljsqdbljxlahyyotqxdigzgnzpnmfqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805447.8125782-117-164717884663933/AnsiballZ_file.py'
Nov 22 09:57:28 compute-0 sudo[189231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:28 compute-0 python3.9[189233]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:57:28 compute-0 sudo[189231]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:29 compute-0 python3.9[189383]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:57:29 compute-0 python3.9[189535]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:30 compute-0 python3.9[189656]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805449.2640758-133-15980140383077/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:57:31 compute-0 sudo[189806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtlzrvgwvfcfwnbvdaznvufxmjficqco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805450.5838735-148-167544040555994/AnsiballZ_group.py'
Nov 22 09:57:31 compute-0 sudo[189806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:31 compute-0 python3.9[189808]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 22 09:57:31 compute-0 sudo[189806]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:31 compute-0 sudo[189958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtzkqbjrjxqvyfaneapxhlzoehotppdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805451.5194814-159-233228352371431/AnsiballZ_getent.py'
Nov 22 09:57:31 compute-0 sudo[189958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:32 compute-0 python3.9[189960]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 22 09:57:32 compute-0 sudo[189958]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:32 compute-0 sudo[190111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bswnibjhsydcjdqqshgajrqispboxjia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805452.3890686-167-11845566513514/AnsiballZ_group.py'
Nov 22 09:57:32 compute-0 sudo[190111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:32 compute-0 python3.9[190113]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 09:57:32 compute-0 groupadd[190114]: group added to /etc/group: name=ceilometer, GID=42405
Nov 22 09:57:32 compute-0 groupadd[190114]: group added to /etc/gshadow: name=ceilometer
Nov 22 09:57:32 compute-0 groupadd[190114]: new group: name=ceilometer, GID=42405
Nov 22 09:57:32 compute-0 sudo[190111]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:33 compute-0 sudo[190269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsqdvekcqfalkvrugznxyhmerylfulqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805453.2203705-175-193220442880455/AnsiballZ_user.py'
Nov 22 09:57:33 compute-0 sudo[190269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:34 compute-0 python3.9[190271]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 09:57:34 compute-0 useradd[190273]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Nov 22 09:57:34 compute-0 useradd[190273]: add 'ceilometer' to group 'libvirt'
Nov 22 09:57:34 compute-0 useradd[190273]: add 'ceilometer' to shadow group 'libvirt'
Nov 22 09:57:34 compute-0 sudo[190269]: pam_unix(sudo:session): session closed for user root
Nov 22 09:57:35 compute-0 python3.9[190429]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:36 compute-0 python3.9[190550]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763805454.9721744-201-233065255386732/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:36 compute-0 python3.9[190701]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:37 compute-0 python3.9[190822]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763805456.2498014-201-96464763275170/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:38 compute-0 python3.9[190972]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:38 compute-0 python3.9[191093]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763805457.5926669-201-61902796001912/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:39 compute-0 python3.9[191243]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:57:39 compute-0 python3.9[191395]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:57:40 compute-0 python3.9[191547]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:40 compute-0 python3.9[191668]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805459.9001179-260-143883450126758/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:41 compute-0 python3.9[191818]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:42 compute-0 python3.9[191894]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:42 compute-0 python3.9[192044]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:43 compute-0 python3.9[192165]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805462.204438-260-74565780340373/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:43 compute-0 python3.9[192315]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:44 compute-0 python3.9[192436]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805463.3569918-260-122963424745170/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:44 compute-0 python3.9[192586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:45 compute-0 python3.9[192707]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805464.5927095-260-82822505915369/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:46 compute-0 python3.9[192857]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:46 compute-0 python3.9[192978]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805465.6423633-260-262286766625750/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:46 compute-0 podman[192979]: 2025-11-22 09:57:46.829456792 +0000 UTC m=+0.101830596 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 09:57:47 compute-0 python3.9[193155]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:47 compute-0 python3.9[193276]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805466.8874884-260-13973733401508/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:48 compute-0 python3.9[193426]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:49 compute-0 python3.9[193547]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805468.1762033-260-175537068502832/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:50 compute-0 python3.9[193697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:50 compute-0 python3.9[193818]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805469.5675259-260-156467185179029/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:51 compute-0 python3.9[193968]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:52 compute-0 python3.9[194089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805470.9202259-260-185619989526921/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:52 compute-0 podman[194090]: 2025-11-22 09:57:52.296328513 +0000 UTC m=+0.078196115 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 09:57:54 compute-0 python3.9[194258]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:54 compute-0 python3.9[194379]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805472.383865-260-156344216348970/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:55 compute-0 python3.9[194529]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:56 compute-0 python3.9[194605]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:56 compute-0 python3.9[194755]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:57 compute-0 podman[194805]: 2025-11-22 09:57:57.232574001 +0000 UTC m=+0.068172724 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 09:57:57 compute-0 python3.9[194845]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:58 compute-0 python3.9[195001]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:57:58 compute-0 python3.9[195077]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:59 compute-0 sudo[195227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afgqzlnqylipfssdukvcgcwivwnbdful ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805479.0129895-449-225034523937523/AnsiballZ_file.py'
Nov 22 09:57:59 compute-0 sudo[195227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:57:59 compute-0 python3.9[195229]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:57:59 compute-0 sudo[195227]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:00 compute-0 sudo[195379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdortudgqykcvwgoarbrkcfpwumtsytn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805479.7494266-457-79754335164684/AnsiballZ_file.py'
Nov 22 09:58:00 compute-0 sudo[195379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:00 compute-0 python3.9[195381]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:58:00 compute-0 sudo[195379]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:00 compute-0 sudo[195531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orkfcuqgqpbacqztwyqbpcwbjqdsxeud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805480.5423021-465-162506383343909/AnsiballZ_file.py'
Nov 22 09:58:00 compute-0 sudo[195531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:01 compute-0 python3.9[195533]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:58:01 compute-0 sudo[195531]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:01 compute-0 sudo[195683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzzzaydumbdvbdxmefuqkrlorahgixrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805481.2530818-473-49086738416871/AnsiballZ_systemd_service.py'
Nov 22 09:58:01 compute-0 sudo[195683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:01 compute-0 python3.9[195685]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:58:01 compute-0 systemd[1]: Reloading.
Nov 22 09:58:02 compute-0 systemd-rc-local-generator[195715]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:58:02 compute-0 systemd-sysv-generator[195718]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:58:02 compute-0 systemd[1]: Listening on Podman API Socket.
Nov 22 09:58:02 compute-0 sudo[195683]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:02 compute-0 sudo[195874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqiuxverypwnsgnanjqkkaajujatcmmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805482.6277301-482-48600237789019/AnsiballZ_stat.py'
Nov 22 09:58:02 compute-0 sudo[195874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:03 compute-0 python3.9[195876]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:58:03 compute-0 sudo[195874]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:03 compute-0 sudo[195997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igfkvhrihsgqwjzpqenyyefmxznhvqyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805482.6277301-482-48600237789019/AnsiballZ_copy.py'
Nov 22 09:58:03 compute-0 sudo[195997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:03 compute-0 python3.9[195999]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805482.6277301-482-48600237789019/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:58:03 compute-0 sudo[195997]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:03 compute-0 sudo[196073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjfjtzcpqgfxyizasimjgofeipxtbwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805482.6277301-482-48600237789019/AnsiballZ_stat.py'
Nov 22 09:58:03 compute-0 sudo[196073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:04 compute-0 python3.9[196075]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:58:04 compute-0 sudo[196073]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:04 compute-0 sudo[196196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tllvjoymyxcsyqxgdomokaijqxyssaqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805482.6277301-482-48600237789019/AnsiballZ_copy.py'
Nov 22 09:58:04 compute-0 sudo[196196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:04 compute-0 python3.9[196198]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805482.6277301-482-48600237789019/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:58:04 compute-0 sudo[196196]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:05 compute-0 sudo[196348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbhiaszfocqqmcibuzqoqdsbvyqjaqnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805485.040252-510-30340287138333/AnsiballZ_container_config_data.py'
Nov 22 09:58:05 compute-0 sudo[196348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:05 compute-0 python3.9[196350]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 22 09:58:05 compute-0 sudo[196348]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.597 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.597 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.597 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.621 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.621 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.621 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.622 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.622 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.623 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.623 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.623 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.623 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.658 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.659 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.660 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.660 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:58:06 compute-0 sudo[196500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoxkornivhpqsxuqwnytdjcmntcrayut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805486.1055164-519-75623979980993/AnsiballZ_container_config_hash.py'
Nov 22 09:58:06 compute-0 sudo[196500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.879 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.881 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6136MB free_disk=73.66183853149414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.882 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.882 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.965 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:58:06 compute-0 nova_compute[186981]: 2025-11-22 09:58:06.966 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:58:06 compute-0 python3.9[196502]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 09:58:06 compute-0 sudo[196500]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:07 compute-0 nova_compute[186981]: 2025-11-22 09:58:07.001 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:58:07 compute-0 nova_compute[186981]: 2025-11-22 09:58:07.024 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:58:07 compute-0 nova_compute[186981]: 2025-11-22 09:58:07.026 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:58:07 compute-0 nova_compute[186981]: 2025-11-22 09:58:07.027 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:58:08 compute-0 sudo[196652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhodhjlgvtqzysphsjxcepjqbanylexp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763805487.366202-529-177726441088690/AnsiballZ_edpm_container_manage.py'
Nov 22 09:58:08 compute-0 sudo[196652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:08 compute-0 python3[196654]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 09:58:08 compute-0 podman[196691]: 2025-11-22 09:58:08.639112837 +0000 UTC m=+0.072756757 container create 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 09:58:08 compute-0 podman[196691]: 2025-11-22 09:58:08.606395521 +0000 UTC m=+0.040039551 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 22 09:58:08 compute-0 python3[196654]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 22 09:58:08 compute-0 sudo[196652]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:09 compute-0 sudo[196879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xboichknjvwgoysxakohozihgdukesfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805489.0499036-537-73012119317568/AnsiballZ_stat.py'
Nov 22 09:58:09 compute-0 sudo[196879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:09 compute-0 python3.9[196881]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:58:09 compute-0 sudo[196879]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:10 compute-0 sudo[197033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfjkefpxkklijosdojkvorvrzbxvfrmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805490.0031404-546-128266008884200/AnsiballZ_file.py'
Nov 22 09:58:10 compute-0 sudo[197033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:10 compute-0 python3.9[197035]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:58:10 compute-0 sudo[197033]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:11 compute-0 sudo[197184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifdlxhktqunozordbnoydxqoseeyoops ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805490.594314-546-215683180624638/AnsiballZ_copy.py'
Nov 22 09:58:11 compute-0 sudo[197184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:11 compute-0 python3.9[197186]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763805490.594314-546-215683180624638/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:58:11 compute-0 sudo[197184]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:12 compute-0 sudo[197260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqqpjqmonzrmxwlathwlngdzuvbrdmpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805490.594314-546-215683180624638/AnsiballZ_systemd.py'
Nov 22 09:58:12 compute-0 sudo[197260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:12 compute-0 python3.9[197262]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:58:12 compute-0 systemd[1]: Reloading.
Nov 22 09:58:12 compute-0 systemd-rc-local-generator[197290]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:58:12 compute-0 systemd-sysv-generator[197295]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:58:12 compute-0 sudo[197260]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:13 compute-0 sudo[197372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pngdfrxjrwklckcosqaxsroyhagpqmwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805490.594314-546-215683180624638/AnsiballZ_systemd.py'
Nov 22 09:58:13 compute-0 sudo[197372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:13 compute-0 python3.9[197374]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:58:13 compute-0 systemd[1]: Reloading.
Nov 22 09:58:13 compute-0 systemd-rc-local-generator[197395]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:58:13 compute-0 systemd-sysv-generator[197402]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:58:13 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 22 09:58:13 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:58:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a51856588e5acbb567a6413bd5de665938040050c6a9071499cf86b0f806473/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a51856588e5acbb567a6413bd5de665938040050c6a9071499cf86b0f806473/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a51856588e5acbb567a6413bd5de665938040050c6a9071499cf86b0f806473/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a51856588e5acbb567a6413bd5de665938040050c6a9071499cf86b0f806473/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:13 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325.
Nov 22 09:58:13 compute-0 podman[197414]: 2025-11-22 09:58:13.937909357 +0000 UTC m=+0.171578224 container init 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 09:58:13 compute-0 ceilometer_agent_compute[197429]: + sudo -E kolla_set_configs
Nov 22 09:58:13 compute-0 podman[197414]: 2025-11-22 09:58:13.967265194 +0000 UTC m=+0.200933971 container start 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS)
Nov 22 09:58:13 compute-0 podman[197414]: ceilometer_agent_compute
Nov 22 09:58:13 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 22 09:58:13 compute-0 ceilometer_agent_compute[197429]: sudo: unable to send audit message: Operation not permitted
Nov 22 09:58:13 compute-0 sudo[197435]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 22 09:58:13 compute-0 sudo[197435]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 09:58:13 compute-0 sudo[197435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 22 09:58:14 compute-0 sudo[197372]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Validating config file
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Copying service configuration files
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: INFO:__main__:Writing out command to execute
Nov 22 09:58:14 compute-0 sudo[197435]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: ++ cat /run_command
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: + ARGS=
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: + sudo kolla_copy_cacerts
Nov 22 09:58:14 compute-0 podman[197436]: 2025-11-22 09:58:14.047403927 +0000 UTC m=+0.068330859 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: sudo: unable to send audit message: Operation not permitted
Nov 22 09:58:14 compute-0 sudo[197456]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 22 09:58:14 compute-0 systemd[1]: 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325-e090bbdde03fdc6.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 09:58:14 compute-0 sudo[197456]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 09:58:14 compute-0 sudo[197456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 22 09:58:14 compute-0 systemd[1]: 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325-e090bbdde03fdc6.service: Failed with result 'exit-code'.
Nov 22 09:58:14 compute-0 sudo[197456]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: + [[ ! -n '' ]]
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: + . kolla_extend_start
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: + umask 0022
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 22 09:58:14 compute-0 sudo[197608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhymnzbhlchsfgklxskcdpnbxsqwzkix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805494.2595544-570-77509872462436/AnsiballZ_systemd.py'
Nov 22 09:58:14 compute-0 sudo[197608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:14 compute-0 python3.9[197610]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.893 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.893 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.894 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.894 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.894 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.894 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.894 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.894 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.894 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.894 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.895 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.895 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.895 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.895 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.895 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.895 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.895 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.896 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.896 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.896 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.896 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.896 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.896 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.896 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.896 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.896 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.897 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.897 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.897 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.897 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.897 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.897 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.897 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.897 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.897 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.897 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.898 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.898 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.898 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.898 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.898 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.898 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.898 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.898 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.898 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.898 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.898 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.899 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.900 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.900 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.900 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.900 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.900 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.900 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.900 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.900 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.901 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.901 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.901 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.903 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.903 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.903 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.903 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.903 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.903 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.903 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.903 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.903 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.903 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.903 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.904 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.904 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.904 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.904 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.904 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.904 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.904 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.904 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.904 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.905 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.905 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.905 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.905 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.905 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.905 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.905 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.905 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.906 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.906 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.906 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.906 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.906 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.906 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.906 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.906 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.906 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.907 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.907 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.907 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.907 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.907 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.907 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.907 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.907 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.907 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.907 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.909 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.909 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.909 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.909 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.909 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.910 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.910 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.910 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.910 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.910 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.910 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.931 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.933 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.934 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 22 09:58:14 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 22 09:58:14 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:14.995 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.038 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.096 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.097 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.108 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.108 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.108 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.108 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.108 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.109 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.109 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.109 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.109 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.109 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.109 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.109 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.109 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.109 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.109 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.109 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.110 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.110 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.110 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.110 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.110 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.110 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.110 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.110 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.110 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.110 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.110 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.111 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.112 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.112 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.112 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.112 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.112 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.112 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.112 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.112 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.112 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.112 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.113 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.114 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.114 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.114 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.114 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.114 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.114 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.114 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.114 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.114 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.114 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.114 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.115 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.115 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.115 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.115 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.115 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.115 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.115 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.115 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.115 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.115 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.115 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.116 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.116 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.116 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.116 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.116 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.116 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.116 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.116 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.116 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.116 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.116 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.117 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.117 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.117 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.117 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.117 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.117 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.117 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.117 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.117 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.117 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.117 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.118 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.118 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.118 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.118 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.118 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.118 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.118 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.118 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.118 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.118 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.118 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.119 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.119 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.119 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.119 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.119 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.119 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.119 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.119 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.119 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.119 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.119 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.120 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.121 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.122 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.123 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.124 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.125 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.125 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.125 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.125 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.125 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.125 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.125 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.125 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.125 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.125 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.125 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.126 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.126 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.126 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.126 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.126 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.126 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.126 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.126 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.126 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.126 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.126 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.127 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.127 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.127 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.127 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.127 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.127 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.127 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.127 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.127 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.128 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197429]: 2025-11-22 09:58:15.143 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 22 09:58:15 compute-0 virtqemud[186556]: End of file while reading data: Input/output error
Nov 22 09:58:15 compute-0 systemd[1]: libpod-378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325.scope: Deactivated successfully.
Nov 22 09:58:15 compute-0 systemd[1]: libpod-378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325.scope: Consumed 1.357s CPU time.
Nov 22 09:58:15 compute-0 podman[197616]: 2025-11-22 09:58:15.303752546 +0000 UTC m=+0.350783830 container died 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 22 09:58:15 compute-0 systemd[1]: 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325-e090bbdde03fdc6.timer: Deactivated successfully.
Nov 22 09:58:15 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325.
Nov 22 09:58:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325-userdata-shm.mount: Deactivated successfully.
Nov 22 09:58:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-3a51856588e5acbb567a6413bd5de665938040050c6a9071499cf86b0f806473-merged.mount: Deactivated successfully.
Nov 22 09:58:15 compute-0 podman[197616]: 2025-11-22 09:58:15.382461146 +0000 UTC m=+0.429492440 container cleanup 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 09:58:15 compute-0 podman[197616]: ceilometer_agent_compute
Nov 22 09:58:15 compute-0 podman[197648]: ceilometer_agent_compute
Nov 22 09:58:15 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 22 09:58:15 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 22 09:58:15 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 22 09:58:15 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:58:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a51856588e5acbb567a6413bd5de665938040050c6a9071499cf86b0f806473/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a51856588e5acbb567a6413bd5de665938040050c6a9071499cf86b0f806473/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a51856588e5acbb567a6413bd5de665938040050c6a9071499cf86b0f806473/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a51856588e5acbb567a6413bd5de665938040050c6a9071499cf86b0f806473/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:15 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325.
Nov 22 09:58:15 compute-0 podman[197659]: 2025-11-22 09:58:15.636986093 +0000 UTC m=+0.132709418 container init 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: + sudo -E kolla_set_configs
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: sudo: unable to send audit message: Operation not permitted
Nov 22 09:58:15 compute-0 sudo[197680]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 22 09:58:15 compute-0 sudo[197680]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 09:58:15 compute-0 sudo[197680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 22 09:58:15 compute-0 podman[197659]: 2025-11-22 09:58:15.676283542 +0000 UTC m=+0.172006847 container start 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 09:58:15 compute-0 podman[197659]: ceilometer_agent_compute
Nov 22 09:58:15 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Validating config file
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Copying service configuration files
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: INFO:__main__:Writing out command to execute
Nov 22 09:58:15 compute-0 sudo[197680]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: ++ cat /run_command
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: + ARGS=
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: + sudo kolla_copy_cacerts
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: sudo: unable to send audit message: Operation not permitted
Nov 22 09:58:15 compute-0 sudo[197694]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 22 09:58:15 compute-0 sudo[197694]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 09:58:15 compute-0 sudo[197608]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:15 compute-0 sudo[197694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 22 09:58:15 compute-0 sudo[197694]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: + [[ ! -n '' ]]
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: + . kolla_extend_start
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: + umask 0022
Nov 22 09:58:15 compute-0 ceilometer_agent_compute[197674]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 22 09:58:15 compute-0 podman[197681]: 2025-11-22 09:58:15.76520298 +0000 UTC m=+0.071500250 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 09:58:15 compute-0 systemd[1]: 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325-41dc7bcec2963ba5.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 09:58:15 compute-0 systemd[1]: 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325-41dc7bcec2963ba5.service: Failed with result 'exit-code'.
Nov 22 09:58:16 compute-0 sudo[197852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdqalqepprceabpptftkmikwdlpjcphs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805495.9263647-578-55496368679833/AnsiballZ_stat.py'
Nov 22 09:58:16 compute-0 sudo[197852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:16 compute-0 python3.9[197854]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:58:16 compute-0 sudo[197852]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.584 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.585 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.585 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.585 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.585 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.585 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.585 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.585 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.585 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.586 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.586 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.586 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.586 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.586 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.586 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.586 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.587 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.587 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.587 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.587 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.587 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.587 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.587 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.587 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.588 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.588 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.588 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.588 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.588 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.588 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.588 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.588 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.588 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.589 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.589 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.589 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.589 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.589 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.589 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.589 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.589 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.590 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.590 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.590 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.590 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.590 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.590 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.590 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.591 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.591 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.591 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.591 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.591 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.591 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.591 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.591 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.592 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.592 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.592 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.592 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.592 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.592 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.592 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.592 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.593 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.593 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.593 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.593 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.593 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.593 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.593 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.593 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.594 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.594 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.594 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.594 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.594 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.594 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.594 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.594 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.595 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.595 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.595 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.595 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.595 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.595 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.595 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.595 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.596 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.596 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.596 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.596 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.596 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.596 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.596 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.596 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.597 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.597 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.597 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.597 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.597 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.597 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.597 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.598 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.598 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.598 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.598 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.598 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.598 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.598 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.598 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.599 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.599 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.599 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.599 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.599 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.599 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.599 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.599 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.599 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.600 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.600 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.600 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.600 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.600 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.600 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.600 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.601 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.601 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.601 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.601 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.601 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.601 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.601 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.601 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.602 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.602 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.602 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.602 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.602 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.602 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.602 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.602 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.602 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.603 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.603 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.603 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.603 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.603 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.603 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.603 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.603 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.604 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.604 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.604 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.604 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.604 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.604 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.604 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.604 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.604 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.605 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.605 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.605 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.605 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.624 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.627 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.628 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.654 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.796 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.796 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.796 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.796 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.797 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.797 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.797 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.797 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.797 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.797 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.798 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.798 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.798 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.798 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.798 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.798 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.799 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.799 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.799 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.799 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.799 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.799 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.799 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.799 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.800 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.800 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.800 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.800 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.800 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.800 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.800 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.800 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.801 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.801 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.801 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.801 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.801 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.801 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.801 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.801 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.802 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.802 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.802 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.802 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.802 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.802 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.802 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.803 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.803 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.803 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.803 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.803 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.803 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.803 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.804 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.804 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.804 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.804 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.804 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.804 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.804 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.805 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.805 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.805 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.805 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.805 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.805 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.805 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.806 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.806 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.806 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.806 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.806 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.806 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.806 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.808 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.808 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.808 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.808 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.808 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.808 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.808 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.808 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.809 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.809 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.809 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.809 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.809 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.809 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.809 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.810 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.810 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.810 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.810 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.810 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.810 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.810 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.811 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.811 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.811 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.811 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.811 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.811 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.811 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.812 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.812 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.812 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.812 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.812 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.812 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.812 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.812 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.813 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.813 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.813 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.813 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.813 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.813 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.813 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.814 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.814 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.814 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.814 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.814 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.814 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.814 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.815 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.815 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.815 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.815 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.815 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.815 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.815 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.815 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.815 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.816 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.816 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.816 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.816 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.816 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.816 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.817 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.817 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.817 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.817 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.817 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.818 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.818 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.818 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.818 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.818 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.818 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.818 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.818 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.819 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.819 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.819 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.819 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.819 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.819 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.819 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.819 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.820 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.820 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.820 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.820 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.820 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.820 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.820 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.820 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.821 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.821 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.821 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.821 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.821 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.821 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.821 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.822 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.822 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.822 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.822 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.822 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.822 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.822 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.823 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.823 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.823 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.823 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.823 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.824 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.824 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.824 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.824 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.824 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.824 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.824 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.825 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.825 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.825 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.825 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.825 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.825 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.825 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.825 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.826 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.826 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.826 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.826 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.826 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.826 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.826 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.828 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.834 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 09:58:16.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:58:16 compute-0 sudo[197982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxomfaqbyvkhypdmyyjbrhvfkuafrwfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805495.9263647-578-55496368679833/AnsiballZ_copy.py'
Nov 22 09:58:16 compute-0 sudo[197982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:16 compute-0 podman[197981]: 2025-11-22 09:58:16.946389884 +0000 UTC m=+0.082644956 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 09:58:17 compute-0 python3.9[197994]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805495.9263647-578-55496368679833/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:58:17 compute-0 sudo[197982]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:17 compute-0 sudo[198159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obdxhlzqpmptnerxgvyflewtstknivki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805497.5178385-595-91138515511346/AnsiballZ_container_config_data.py'
Nov 22 09:58:17 compute-0 sudo[198159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:58:17.925 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:58:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:58:17.926 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:58:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:58:17.926 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:58:18 compute-0 python3.9[198161]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 22 09:58:18 compute-0 sudo[198159]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:18 compute-0 sudo[198311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orxwryatesfskrtdupxgzusdirnfgrny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805498.3730092-604-243540261169111/AnsiballZ_container_config_hash.py'
Nov 22 09:58:18 compute-0 sudo[198311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:18 compute-0 python3.9[198313]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 09:58:18 compute-0 sudo[198311]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:19 compute-0 sudo[198463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjlxzgkqaaeukajxoymswygfkdcjinqf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763805499.3008428-614-145748176090207/AnsiballZ_edpm_container_manage.py'
Nov 22 09:58:19 compute-0 sudo[198463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:19 compute-0 python3[198465]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 09:58:20 compute-0 podman[198501]: 2025-11-22 09:58:20.09238298 +0000 UTC m=+0.028674559 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 22 09:58:20 compute-0 podman[198501]: 2025-11-22 09:58:20.323218455 +0000 UTC m=+0.259510034 container create 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 09:58:20 compute-0 python3[198465]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 22 09:58:20 compute-0 sudo[198463]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:21 compute-0 sudo[198687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nukhvhsobolefsycxdlzizybnzxdraoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805500.6936362-622-217477103842243/AnsiballZ_stat.py'
Nov 22 09:58:21 compute-0 sudo[198687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:21 compute-0 python3.9[198689]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:58:21 compute-0 sudo[198687]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:21 compute-0 sudo[198841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuzkkyvnzdnppyrtykxlmholyysaiqir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805501.5487528-631-119283489376686/AnsiballZ_file.py'
Nov 22 09:58:21 compute-0 sudo[198841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:22 compute-0 python3.9[198843]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:58:22 compute-0 sudo[198841]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:22 compute-0 podman[198917]: 2025-11-22 09:58:22.620931509 +0000 UTC m=+0.064816691 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:58:22 compute-0 sudo[199010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qghlbnppbsvvycoeiqjlpdrmuefkckap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805502.2878954-631-139716022239660/AnsiballZ_copy.py'
Nov 22 09:58:22 compute-0 sudo[199010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:22 compute-0 python3.9[199012]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763805502.2878954-631-139716022239660/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:58:23 compute-0 sudo[199010]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:23 compute-0 sudo[199086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmcvixnsvpuqwlxitsbwoonldycbetyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805502.2878954-631-139716022239660/AnsiballZ_systemd.py'
Nov 22 09:58:23 compute-0 sudo[199086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:23 compute-0 python3.9[199088]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:58:23 compute-0 systemd[1]: Reloading.
Nov 22 09:58:23 compute-0 systemd-sysv-generator[199121]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:58:23 compute-0 systemd-rc-local-generator[199116]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:58:23 compute-0 sudo[199086]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:24 compute-0 sudo[199197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toghmdkdmkevyeeioptenaepibbkvtzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805502.2878954-631-139716022239660/AnsiballZ_systemd.py'
Nov 22 09:58:24 compute-0 sudo[199197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:24 compute-0 python3.9[199199]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:58:24 compute-0 systemd[1]: Reloading.
Nov 22 09:58:24 compute-0 systemd-sysv-generator[199236]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:58:24 compute-0 systemd-rc-local-generator[199232]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:58:24 compute-0 systemd[1]: Starting node_exporter container...
Nov 22 09:58:25 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:58:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f0f6f1a099da0509a0e79307953d2beaa8ff7c52120ac2a246bcd4de4f215a2/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f0f6f1a099da0509a0e79307953d2beaa8ff7c52120ac2a246bcd4de4f215a2/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:25 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc.
Nov 22 09:58:25 compute-0 podman[199240]: 2025-11-22 09:58:25.109770177 +0000 UTC m=+0.108325598 container init 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.122Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.122Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.122Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.122Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.122Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=arp
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=bcache
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=bonding
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=cpu
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=edac
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=filefd
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=netclass
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=netdev
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=netstat
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=nfs
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=nvme
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=softnet
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=systemd
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=xfs
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.123Z caller=node_exporter.go:117 level=info collector=zfs
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.124Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 22 09:58:25 compute-0 node_exporter[199255]: ts=2025-11-22T09:58:25.124Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 22 09:58:25 compute-0 podman[199240]: 2025-11-22 09:58:25.139793917 +0000 UTC m=+0.138349318 container start 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 09:58:25 compute-0 podman[199240]: node_exporter
Nov 22 09:58:25 compute-0 systemd[1]: Started node_exporter container.
Nov 22 09:58:25 compute-0 sudo[199197]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:25 compute-0 podman[199264]: 2025-11-22 09:58:25.200095617 +0000 UTC m=+0.049537650 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 09:58:25 compute-0 sudo[199438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqlfgvwqgcyyrqjbvrcbpnrmuusdsydi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805505.357795-655-221274421716792/AnsiballZ_systemd.py'
Nov 22 09:58:25 compute-0 sudo[199438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:25 compute-0 python3.9[199440]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:58:25 compute-0 systemd[1]: Stopping node_exporter container...
Nov 22 09:58:26 compute-0 systemd[1]: libpod-6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc.scope: Deactivated successfully.
Nov 22 09:58:26 compute-0 podman[199444]: 2025-11-22 09:58:26.075930888 +0000 UTC m=+0.066970760 container died 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 09:58:26 compute-0 systemd[1]: 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc-75684714fcb5b7.timer: Deactivated successfully.
Nov 22 09:58:26 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc.
Nov 22 09:58:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc-userdata-shm.mount: Deactivated successfully.
Nov 22 09:58:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-9f0f6f1a099da0509a0e79307953d2beaa8ff7c52120ac2a246bcd4de4f215a2-merged.mount: Deactivated successfully.
Nov 22 09:58:26 compute-0 podman[199444]: 2025-11-22 09:58:26.120207224 +0000 UTC m=+0.111247086 container cleanup 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 09:58:26 compute-0 podman[199444]: node_exporter
Nov 22 09:58:26 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 09:58:26 compute-0 podman[199471]: node_exporter
Nov 22 09:58:26 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 22 09:58:26 compute-0 systemd[1]: Stopped node_exporter container.
Nov 22 09:58:26 compute-0 systemd[1]: Starting node_exporter container...
Nov 22 09:58:26 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:58:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f0f6f1a099da0509a0e79307953d2beaa8ff7c52120ac2a246bcd4de4f215a2/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f0f6f1a099da0509a0e79307953d2beaa8ff7c52120ac2a246bcd4de4f215a2/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:26 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc.
Nov 22 09:58:26 compute-0 podman[199484]: 2025-11-22 09:58:26.357992168 +0000 UTC m=+0.144312990 container init 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.373Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.373Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.374Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.374Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.374Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.374Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.374Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.374Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.374Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=arp
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=bcache
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=bonding
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=cpu
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=edac
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=filefd
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=netclass
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=netdev
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=netstat
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=nfs
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=nvme
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=softnet
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=systemd
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=xfs
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=node_exporter.go:117 level=info collector=zfs
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.375Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 22 09:58:26 compute-0 node_exporter[199499]: ts=2025-11-22T09:58:26.376Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 22 09:58:26 compute-0 podman[199484]: 2025-11-22 09:58:26.394867974 +0000 UTC m=+0.181188716 container start 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 09:58:26 compute-0 podman[199484]: node_exporter
Nov 22 09:58:26 compute-0 systemd[1]: Started node_exporter container.
Nov 22 09:58:26 compute-0 sudo[199438]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:26 compute-0 podman[199509]: 2025-11-22 09:58:26.461247663 +0000 UTC m=+0.057469809 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 09:58:27 compute-0 sudo[199683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnsfsvtfolupfwthkyzlqzqgyizjpvfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805506.6702948-663-230186362253206/AnsiballZ_stat.py'
Nov 22 09:58:27 compute-0 sudo[199683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:27 compute-0 python3.9[199685]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:58:27 compute-0 sudo[199683]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:27 compute-0 podman[199756]: 2025-11-22 09:58:27.641261228 +0000 UTC m=+0.080802921 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible)
Nov 22 09:58:27 compute-0 sudo[199826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klyehqxxfunpbzwlilwukicuroyfjdxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805506.6702948-663-230186362253206/AnsiballZ_copy.py'
Nov 22 09:58:27 compute-0 sudo[199826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:27 compute-0 python3.9[199828]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805506.6702948-663-230186362253206/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:58:27 compute-0 sudo[199826]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:28 compute-0 sudo[199978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhlyckcmxsghcgnmhamqdltmrjnbfzuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805508.2491915-680-215276699926318/AnsiballZ_container_config_data.py'
Nov 22 09:58:28 compute-0 sudo[199978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:28 compute-0 python3.9[199980]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 22 09:58:28 compute-0 sudo[199978]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:29 compute-0 sudo[200130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzkcwvyxhhwexfplmqufkmjqjwyotnzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805509.000822-689-146048876394667/AnsiballZ_container_config_hash.py'
Nov 22 09:58:29 compute-0 sudo[200130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:29 compute-0 python3.9[200132]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 09:58:29 compute-0 sudo[200130]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:30 compute-0 sudo[200282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efboqwhejgogxifayqhoncmffgeefnfw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763805509.9150188-699-146114706905166/AnsiballZ_edpm_container_manage.py'
Nov 22 09:58:30 compute-0 sudo[200282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:30 compute-0 python3[200284]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 09:58:32 compute-0 podman[200296]: 2025-11-22 09:58:32.035653992 +0000 UTC m=+1.358084987 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 22 09:58:32 compute-0 podman[200395]: 2025-11-22 09:58:32.238336375 +0000 UTC m=+0.069086960 container create 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm)
Nov 22 09:58:32 compute-0 podman[200395]: 2025-11-22 09:58:32.20597067 +0000 UTC m=+0.036721345 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 22 09:58:32 compute-0 python3[200284]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 22 09:58:32 compute-0 sudo[200282]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:33 compute-0 sudo[200583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quroabqpcahvtxwjsidytnakoqlqednx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805512.632707-707-148443754077045/AnsiballZ_stat.py'
Nov 22 09:58:33 compute-0 sudo[200583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:33 compute-0 python3.9[200585]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:58:33 compute-0 sudo[200583]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:33 compute-0 sudo[200737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkvphxyurxjjwgulpanoahzjazppxodw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805513.6138349-716-176845918221886/AnsiballZ_file.py'
Nov 22 09:58:33 compute-0 sudo[200737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:34 compute-0 python3.9[200739]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:58:34 compute-0 sudo[200737]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:34 compute-0 sudo[200888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqpsabkngqheolxglrdscliuaqbymsym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805514.2446022-716-182684102854261/AnsiballZ_copy.py'
Nov 22 09:58:34 compute-0 sudo[200888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:34 compute-0 python3.9[200890]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763805514.2446022-716-182684102854261/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:58:34 compute-0 sudo[200888]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:35 compute-0 sudo[200964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krcumjctstsfohhhqswttpwbeuxybhrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805514.2446022-716-182684102854261/AnsiballZ_systemd.py'
Nov 22 09:58:35 compute-0 sudo[200964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:35 compute-0 python3.9[200966]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:58:35 compute-0 systemd[1]: Reloading.
Nov 22 09:58:35 compute-0 systemd-rc-local-generator[200994]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:58:35 compute-0 systemd-sysv-generator[200997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:58:35 compute-0 sudo[200964]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:36 compute-0 sudo[201075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alapqffyydpzavtbsztdcmalqnammoat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805514.2446022-716-182684102854261/AnsiballZ_systemd.py'
Nov 22 09:58:36 compute-0 sudo[201075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:36 compute-0 python3.9[201077]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:58:36 compute-0 systemd[1]: Reloading.
Nov 22 09:58:36 compute-0 systemd-sysv-generator[201109]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:58:36 compute-0 systemd-rc-local-generator[201103]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:58:37 compute-0 systemd[1]: Starting podman_exporter container...
Nov 22 09:58:37 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:58:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16e87c5afeed0e6ea16f96bf95ad4c3bdb3ef1c9a506baad81fba00496835cc1/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16e87c5afeed0e6ea16f96bf95ad4c3bdb3ef1c9a506baad81fba00496835cc1/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:37 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b.
Nov 22 09:58:37 compute-0 podman[201117]: 2025-11-22 09:58:37.236215415 +0000 UTC m=+0.184759534 container init 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:58:37 compute-0 podman_exporter[201133]: ts=2025-11-22T09:58:37.263Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 22 09:58:37 compute-0 podman_exporter[201133]: ts=2025-11-22T09:58:37.264Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 22 09:58:37 compute-0 podman_exporter[201133]: ts=2025-11-22T09:58:37.264Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 22 09:58:37 compute-0 podman_exporter[201133]: ts=2025-11-22T09:58:37.264Z caller=handler.go:105 level=info collector=container
Nov 22 09:58:37 compute-0 podman[201117]: 2025-11-22 09:58:37.275685965 +0000 UTC m=+0.224230044 container start 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:58:37 compute-0 podman[201117]: podman_exporter
Nov 22 09:58:37 compute-0 systemd[1]: Starting Podman API Service...
Nov 22 09:58:37 compute-0 systemd[1]: Started Podman API Service.
Nov 22 09:58:37 compute-0 systemd[1]: Started podman_exporter container.
Nov 22 09:58:37 compute-0 podman[201144]: time="2025-11-22T09:58:37Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 22 09:58:37 compute-0 podman[201144]: time="2025-11-22T09:58:37Z" level=info msg="Setting parallel job count to 25"
Nov 22 09:58:37 compute-0 podman[201144]: time="2025-11-22T09:58:37Z" level=info msg="Using sqlite as database backend"
Nov 22 09:58:37 compute-0 podman[201144]: time="2025-11-22T09:58:37Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 22 09:58:37 compute-0 podman[201144]: time="2025-11-22T09:58:37Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 22 09:58:37 compute-0 podman[201144]: time="2025-11-22T09:58:37Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 22 09:58:37 compute-0 sudo[201075]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:37 compute-0 podman[201144]: @ - - [22/Nov/2025:09:58:37 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 22 09:58:37 compute-0 podman[201144]: time="2025-11-22T09:58:37Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 22 09:58:37 compute-0 podman[201142]: 2025-11-22 09:58:37.380028979 +0000 UTC m=+0.085464029 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:58:37 compute-0 systemd[1]: 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b-7cbbbd1c0a9067b6.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 09:58:37 compute-0 systemd[1]: 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b-7cbbbd1c0a9067b6.service: Failed with result 'exit-code'.
Nov 22 09:58:37 compute-0 podman[201144]: @ - - [22/Nov/2025:09:58:37 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19567 "" "Go-http-client/1.1"
Nov 22 09:58:37 compute-0 podman_exporter[201133]: ts=2025-11-22T09:58:37.404Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 22 09:58:37 compute-0 podman_exporter[201133]: ts=2025-11-22T09:58:37.405Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 22 09:58:37 compute-0 podman_exporter[201133]: ts=2025-11-22T09:58:37.406Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 22 09:58:37 compute-0 sudo[201330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqtgxdyvjelohnxhkewonhqgxngnqazh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805517.5677898-740-275848397165708/AnsiballZ_systemd.py'
Nov 22 09:58:37 compute-0 sudo[201330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:38 compute-0 python3.9[201332]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:58:38 compute-0 systemd[1]: Stopping podman_exporter container...
Nov 22 09:58:38 compute-0 podman[201144]: @ - - [22/Nov/2025:09:58:37 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Nov 22 09:58:38 compute-0 systemd[1]: libpod-2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b.scope: Deactivated successfully.
Nov 22 09:58:38 compute-0 conmon[201133]: conmon 2513067a521a60ea91f0 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b.scope/container/memory.events
Nov 22 09:58:38 compute-0 podman[201336]: 2025-11-22 09:58:38.351962533 +0000 UTC m=+0.069012129 container died 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:58:38 compute-0 systemd[1]: 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b-7cbbbd1c0a9067b6.timer: Deactivated successfully.
Nov 22 09:58:38 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b.
Nov 22 09:58:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b-userdata-shm.mount: Deactivated successfully.
Nov 22 09:58:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-16e87c5afeed0e6ea16f96bf95ad4c3bdb3ef1c9a506baad81fba00496835cc1-merged.mount: Deactivated successfully.
Nov 22 09:58:38 compute-0 podman[201336]: 2025-11-22 09:58:38.617927117 +0000 UTC m=+0.334976743 container cleanup 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 09:58:38 compute-0 podman[201336]: podman_exporter
Nov 22 09:58:38 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 09:58:38 compute-0 podman[201364]: podman_exporter
Nov 22 09:58:38 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 22 09:58:38 compute-0 systemd[1]: Stopped podman_exporter container.
Nov 22 09:58:38 compute-0 systemd[1]: Starting podman_exporter container...
Nov 22 09:58:38 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:58:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16e87c5afeed0e6ea16f96bf95ad4c3bdb3ef1c9a506baad81fba00496835cc1/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16e87c5afeed0e6ea16f96bf95ad4c3bdb3ef1c9a506baad81fba00496835cc1/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:38 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b.
Nov 22 09:58:38 compute-0 podman[201377]: 2025-11-22 09:58:38.882061112 +0000 UTC m=+0.141572414 container init 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:58:38 compute-0 podman_exporter[201392]: ts=2025-11-22T09:58:38.900Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 22 09:58:38 compute-0 podman_exporter[201392]: ts=2025-11-22T09:58:38.900Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 22 09:58:38 compute-0 podman_exporter[201392]: ts=2025-11-22T09:58:38.900Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 22 09:58:38 compute-0 podman_exporter[201392]: ts=2025-11-22T09:58:38.900Z caller=handler.go:105 level=info collector=container
Nov 22 09:58:38 compute-0 podman[201144]: @ - - [22/Nov/2025:09:58:38 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 22 09:58:38 compute-0 podman[201144]: time="2025-11-22T09:58:38Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 22 09:58:38 compute-0 podman[201377]: 2025-11-22 09:58:38.918740875 +0000 UTC m=+0.178252227 container start 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:58:38 compute-0 podman[201377]: podman_exporter
Nov 22 09:58:38 compute-0 systemd[1]: Started podman_exporter container.
Nov 22 09:58:38 compute-0 podman[201144]: @ - - [22/Nov/2025:09:58:38 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19569 "" "Go-http-client/1.1"
Nov 22 09:58:38 compute-0 podman_exporter[201392]: ts=2025-11-22T09:58:38.939Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 22 09:58:38 compute-0 podman_exporter[201392]: ts=2025-11-22T09:58:38.940Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 22 09:58:38 compute-0 podman_exporter[201392]: ts=2025-11-22T09:58:38.941Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 22 09:58:38 compute-0 sudo[201330]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:39 compute-0 podman[201401]: 2025-11-22 09:58:39.026604105 +0000 UTC m=+0.087239458 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:58:39 compute-0 sudo[201575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuuavbkkdytbjnxcooxzewleraehxspy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805519.2254674-748-252474604746062/AnsiballZ_stat.py'
Nov 22 09:58:39 compute-0 sudo[201575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:39 compute-0 python3.9[201577]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:58:39 compute-0 sudo[201575]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:40 compute-0 sudo[201698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oespzfezvzdkqsslwpzxkcqkugmbvjnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805519.2254674-748-252474604746062/AnsiballZ_copy.py'
Nov 22 09:58:40 compute-0 sudo[201698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:40 compute-0 python3.9[201700]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763805519.2254674-748-252474604746062/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 09:58:40 compute-0 sudo[201698]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:41 compute-0 sudo[201850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjzhiasztbrrnjxppxbzauhsodsxcvfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805520.749763-765-126872572386053/AnsiballZ_container_config_data.py'
Nov 22 09:58:41 compute-0 sudo[201850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:41 compute-0 python3.9[201852]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 22 09:58:41 compute-0 sudo[201850]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:41 compute-0 sudo[202002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esinxnhiqppspijikjnrirwppbhepive ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805521.6507125-774-188340440250690/AnsiballZ_container_config_hash.py'
Nov 22 09:58:41 compute-0 sudo[202002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:42 compute-0 python3.9[202004]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 09:58:42 compute-0 sudo[202002]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:42 compute-0 sudo[202154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugchdfwwboqqqajhlgueygnwyufryzvb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763805522.5754828-784-229909853834101/AnsiballZ_edpm_container_manage.py'
Nov 22 09:58:42 compute-0 sudo[202154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:42 compute-0 auditd[703]: Audit daemon rotating log files
Nov 22 09:58:43 compute-0 python3[202156]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 09:58:45 compute-0 podman[202169]: 2025-11-22 09:58:45.620071756 +0000 UTC m=+2.393689292 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 09:58:45 compute-0 podman[202266]: 2025-11-22 09:58:45.757802953 +0000 UTC m=+0.044125147 container create ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6)
Nov 22 09:58:45 compute-0 podman[202266]: 2025-11-22 09:58:45.733759176 +0000 UTC m=+0.020081410 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 09:58:45 compute-0 python3[202156]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 09:58:45 compute-0 sudo[202154]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:46 compute-0 sudo[202466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwomnmeyznliuescmqnmhtaqrjlnerww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805526.1847055-792-57608631049949/AnsiballZ_stat.py'
Nov 22 09:58:46 compute-0 sudo[202466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:46 compute-0 podman[202428]: 2025-11-22 09:58:46.595869667 +0000 UTC m=+0.099396981 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:58:46 compute-0 systemd[1]: 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325-41dc7bcec2963ba5.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 09:58:46 compute-0 systemd[1]: 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325-41dc7bcec2963ba5.service: Failed with result 'exit-code'.
Nov 22 09:58:46 compute-0 python3.9[202469]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:58:46 compute-0 sudo[202466]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:47 compute-0 sudo[202640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izopmzkzcvbrcnbuydqfpzgiwursumuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805527.0349426-801-239420021907838/AnsiballZ_file.py'
Nov 22 09:58:47 compute-0 sudo[202640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:47 compute-0 podman[202601]: 2025-11-22 09:58:47.475285969 +0000 UTC m=+0.109319711 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 22 09:58:47 compute-0 python3.9[202648]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:58:47 compute-0 sudo[202640]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:48 compute-0 sudo[202804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyjfdtcgmumtavuhwytuzuhtiizemgoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805527.7270179-801-244185535107657/AnsiballZ_copy.py'
Nov 22 09:58:48 compute-0 sudo[202804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:48 compute-0 python3.9[202806]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763805527.7270179-801-244185535107657/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:58:48 compute-0 sudo[202804]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:48 compute-0 sudo[202880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnwhihviiqzgcxsjljfdswdtuqhrlpls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805527.7270179-801-244185535107657/AnsiballZ_systemd.py'
Nov 22 09:58:48 compute-0 sudo[202880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:48 compute-0 python3.9[202882]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 09:58:48 compute-0 systemd[1]: Reloading.
Nov 22 09:58:49 compute-0 systemd-rc-local-generator[202910]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:58:49 compute-0 systemd-sysv-generator[202913]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:58:49 compute-0 sudo[202880]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:49 compute-0 sudo[202992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhizksnhwtqlfrkoxvrvxszkjhjtfnga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805527.7270179-801-244185535107657/AnsiballZ_systemd.py'
Nov 22 09:58:49 compute-0 sudo[202992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:49 compute-0 python3.9[202994]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 09:58:49 compute-0 systemd[1]: Reloading.
Nov 22 09:58:50 compute-0 systemd-rc-local-generator[203023]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 09:58:50 compute-0 systemd-sysv-generator[203027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 09:58:50 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 22 09:58:50 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:58:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60fafd67e95b0bc9ec2084ab25bc252d72a37c0dee727553fa049279419fc7b1/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60fafd67e95b0bc9ec2084ab25bc252d72a37c0dee727553fa049279419fc7b1/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60fafd67e95b0bc9ec2084ab25bc252d72a37c0dee727553fa049279419fc7b1/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:50 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296.
Nov 22 09:58:50 compute-0 podman[203033]: 2025-11-22 09:58:50.453047185 +0000 UTC m=+0.140288128 container init ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter, distribution-scope=public)
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: INFO    09:58:50 main.go:48: registering *bridge.Collector
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: INFO    09:58:50 main.go:48: registering *coverage.Collector
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: INFO    09:58:50 main.go:48: registering *datapath.Collector
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: INFO    09:58:50 main.go:48: registering *iface.Collector
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: INFO    09:58:50 main.go:48: registering *memory.Collector
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: INFO    09:58:50 main.go:48: registering *ovnnorthd.Collector
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: INFO    09:58:50 main.go:48: registering *ovn.Collector
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: INFO    09:58:50 main.go:48: registering *ovsdbserver.Collector
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: INFO    09:58:50 main.go:48: registering *pmd_perf.Collector
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: INFO    09:58:50 main.go:48: registering *pmd_rxq.Collector
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: INFO    09:58:50 main.go:48: registering *vswitch.Collector
Nov 22 09:58:50 compute-0 openstack_network_exporter[203048]: NOTICE  09:58:50 main.go:76: listening on https://:9105/metrics
Nov 22 09:58:50 compute-0 podman[203033]: 2025-11-22 09:58:50.482726268 +0000 UTC m=+0.169967131 container start ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7)
Nov 22 09:58:50 compute-0 podman[203033]: openstack_network_exporter
Nov 22 09:58:50 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 22 09:58:50 compute-0 sudo[202992]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:50 compute-0 podman[203058]: 2025-11-22 09:58:50.567313771 +0000 UTC m=+0.075058144 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 09:58:51 compute-0 sudo[203230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nacxixptwubmwnvxqqvxaydzafgknwkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805530.701994-825-104385114026061/AnsiballZ_systemd.py'
Nov 22 09:58:51 compute-0 sudo[203230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:51 compute-0 python3.9[203232]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 09:58:51 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Nov 22 09:58:51 compute-0 systemd[1]: libpod-ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296.scope: Deactivated successfully.
Nov 22 09:58:51 compute-0 conmon[203048]: conmon ff15f44cf5a5d558d855 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296.scope/container/memory.events
Nov 22 09:58:51 compute-0 podman[203236]: 2025-11-22 09:58:51.575391414 +0000 UTC m=+0.187901961 container died ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 22 09:58:51 compute-0 systemd[1]: ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296-5c32c3162232250c.timer: Deactivated successfully.
Nov 22 09:58:51 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296.
Nov 22 09:58:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296-userdata-shm.mount: Deactivated successfully.
Nov 22 09:58:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-60fafd67e95b0bc9ec2084ab25bc252d72a37c0dee727553fa049279419fc7b1-merged.mount: Deactivated successfully.
Nov 22 09:58:52 compute-0 podman[203236]: 2025-11-22 09:58:52.390993741 +0000 UTC m=+1.003504338 container cleanup ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, architecture=x86_64)
Nov 22 09:58:52 compute-0 podman[203236]: openstack_network_exporter
Nov 22 09:58:52 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 09:58:52 compute-0 podman[203264]: openstack_network_exporter
Nov 22 09:58:52 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 22 09:58:52 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Nov 22 09:58:52 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 22 09:58:52 compute-0 systemd[1]: Started libcrun container.
Nov 22 09:58:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60fafd67e95b0bc9ec2084ab25bc252d72a37c0dee727553fa049279419fc7b1/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60fafd67e95b0bc9ec2084ab25bc252d72a37c0dee727553fa049279419fc7b1/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60fafd67e95b0bc9ec2084ab25bc252d72a37c0dee727553fa049279419fc7b1/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 09:58:52 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296.
Nov 22 09:58:52 compute-0 podman[203277]: 2025-11-22 09:58:52.626506583 +0000 UTC m=+0.141040229 container init ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git)
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: INFO    09:58:52 main.go:48: registering *bridge.Collector
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: INFO    09:58:52 main.go:48: registering *coverage.Collector
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: INFO    09:58:52 main.go:48: registering *datapath.Collector
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: INFO    09:58:52 main.go:48: registering *iface.Collector
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: INFO    09:58:52 main.go:48: registering *memory.Collector
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: INFO    09:58:52 main.go:48: registering *ovnnorthd.Collector
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: INFO    09:58:52 main.go:48: registering *ovn.Collector
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: INFO    09:58:52 main.go:48: registering *ovsdbserver.Collector
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: INFO    09:58:52 main.go:48: registering *pmd_perf.Collector
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: INFO    09:58:52 main.go:48: registering *pmd_rxq.Collector
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: INFO    09:58:52 main.go:48: registering *vswitch.Collector
Nov 22 09:58:52 compute-0 openstack_network_exporter[203292]: NOTICE  09:58:52 main.go:76: listening on https://:9105/metrics
Nov 22 09:58:52 compute-0 podman[203277]: 2025-11-22 09:58:52.653023498 +0000 UTC m=+0.167557094 container start ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 09:58:52 compute-0 podman[203277]: openstack_network_exporter
Nov 22 09:58:52 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 22 09:58:52 compute-0 podman[203302]: 2025-11-22 09:58:52.697188647 +0000 UTC m=+0.044702614 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:58:52 compute-0 sudo[203230]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:52 compute-0 podman[203303]: 2025-11-22 09:58:52.731566967 +0000 UTC m=+0.069402610 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Nov 22 09:58:53 compute-0 sudo[203490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fobthevzoeaspdsxqwzbzliblxiadjjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805532.8803258-833-166133444055317/AnsiballZ_find.py'
Nov 22 09:58:53 compute-0 sudo[203490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:53 compute-0 python3.9[203492]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 09:58:53 compute-0 sudo[203490]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:54 compute-0 sudo[203642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpmurzqzjjjhmfpzfyoljzrntkxtvdpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805533.9660087-843-241982461411033/AnsiballZ_podman_container_info.py'
Nov 22 09:58:54 compute-0 sudo[203642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:54 compute-0 python3.9[203644]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 22 09:58:54 compute-0 sudo[203642]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:55 compute-0 sudo[203807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojznnogaakpbfnlhonhcdfzraikcwjzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805534.933604-851-192055562298401/AnsiballZ_podman_container_exec.py'
Nov 22 09:58:55 compute-0 sudo[203807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:55 compute-0 python3.9[203809]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:58:55 compute-0 systemd[1]: Started libpod-conmon-e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e.scope.
Nov 22 09:58:55 compute-0 podman[203810]: 2025-11-22 09:58:55.724315643 +0000 UTC m=+0.088920724 container exec e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 09:58:55 compute-0 podman[203810]: 2025-11-22 09:58:55.758017584 +0000 UTC m=+0.122622665 container exec_died e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 09:58:55 compute-0 sudo[203807]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:55 compute-0 systemd[1]: libpod-conmon-e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e.scope: Deactivated successfully.
Nov 22 09:58:56 compute-0 sudo[203992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oljtylybcrnyfywuhiqvyncfzuhlfrpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805536.002285-859-250093091025849/AnsiballZ_podman_container_exec.py'
Nov 22 09:58:56 compute-0 sudo[203992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:56 compute-0 python3.9[203994]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:58:56 compute-0 systemd[1]: Started libpod-conmon-e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e.scope.
Nov 22 09:58:56 compute-0 podman[203995]: 2025-11-22 09:58:56.655487862 +0000 UTC m=+0.101496638 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:58:56 compute-0 podman[204001]: 2025-11-22 09:58:56.66347582 +0000 UTC m=+0.078807387 container exec e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 09:58:56 compute-0 podman[204001]: 2025-11-22 09:58:56.695121136 +0000 UTC m=+0.110452693 container exec_died e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 09:58:56 compute-0 systemd[1]: libpod-conmon-e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e.scope: Deactivated successfully.
Nov 22 09:58:56 compute-0 sudo[203992]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:57 compute-0 sudo[204199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyhsuffehjajyrvnpuetiorahcgpjprz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805536.9343617-867-143655507314513/AnsiballZ_file.py'
Nov 22 09:58:57 compute-0 sudo[204199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:57 compute-0 python3.9[204201]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:58:57 compute-0 sudo[204199]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:58 compute-0 sudo[204368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufmdvntgmsliuipwldnhardbfjkolrom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805537.8345475-876-19234531922925/AnsiballZ_podman_container_info.py'
Nov 22 09:58:58 compute-0 sudo[204368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:58 compute-0 podman[204325]: 2025-11-22 09:58:58.159435947 +0000 UTC m=+0.070836598 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 09:58:58 compute-0 python3.9[204373]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 22 09:58:58 compute-0 sudo[204368]: pam_unix(sudo:session): session closed for user root
Nov 22 09:58:59 compute-0 sudo[204533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqsnidabawqfldfrjukohtaugdbytkny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805538.6605027-884-38036508961721/AnsiballZ_podman_container_exec.py'
Nov 22 09:58:59 compute-0 sudo[204533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:58:59 compute-0 python3.9[204535]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:58:59 compute-0 systemd[1]: Started libpod-conmon-6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f.scope.
Nov 22 09:58:59 compute-0 podman[204536]: 2025-11-22 09:58:59.333233552 +0000 UTC m=+0.086260800 container exec 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 09:58:59 compute-0 podman[204536]: 2025-11-22 09:58:59.370190623 +0000 UTC m=+0.123217841 container exec_died 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 09:58:59 compute-0 systemd[1]: libpod-conmon-6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f.scope: Deactivated successfully.
Nov 22 09:58:59 compute-0 sudo[204533]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:00 compute-0 sudo[204716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xobwnpzauirmpxfnurxxkfkyhubbdtao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805539.6480615-892-74497482138314/AnsiballZ_podman_container_exec.py'
Nov 22 09:59:00 compute-0 sudo[204716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:00 compute-0 python3.9[204718]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:59:00 compute-0 systemd[1]: Started libpod-conmon-6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f.scope.
Nov 22 09:59:00 compute-0 podman[204719]: 2025-11-22 09:59:00.346170388 +0000 UTC m=+0.084446261 container exec 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 09:59:00 compute-0 podman[204719]: 2025-11-22 09:59:00.377759841 +0000 UTC m=+0.116035744 container exec_died 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 22 09:59:00 compute-0 systemd[1]: libpod-conmon-6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f.scope: Deactivated successfully.
Nov 22 09:59:00 compute-0 sudo[204716]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:01 compute-0 sudo[204899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsvnfsfjipnmpuevaqhzveahdrsuzngf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805540.6588936-900-56041740042135/AnsiballZ_file.py'
Nov 22 09:59:01 compute-0 sudo[204899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:01 compute-0 python3.9[204901]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:01 compute-0 sudo[204899]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:01 compute-0 sudo[205051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwpkjtnbxdhezgdyhpcdirkyltfbqgzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805541.462948-909-243305160517987/AnsiballZ_podman_container_info.py'
Nov 22 09:59:01 compute-0 sudo[205051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:01 compute-0 python3.9[205053]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 22 09:59:02 compute-0 sudo[205051]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:02 compute-0 sudo[205216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wphcrlscemmwkqanogrnipjzmslyuklp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805542.32404-917-62306593691063/AnsiballZ_podman_container_exec.py'
Nov 22 09:59:02 compute-0 sudo[205216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:02 compute-0 python3.9[205218]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:59:02 compute-0 systemd[1]: Started libpod-conmon-a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323.scope.
Nov 22 09:59:02 compute-0 podman[205219]: 2025-11-22 09:59:02.920716735 +0000 UTC m=+0.067725553 container exec a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 09:59:02 compute-0 podman[205219]: 2025-11-22 09:59:02.925187798 +0000 UTC m=+0.072196676 container exec_died a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 09:59:02 compute-0 sudo[205216]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:02 compute-0 systemd[1]: libpod-conmon-a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323.scope: Deactivated successfully.
Nov 22 09:59:03 compute-0 sudo[205401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opgzasltrgwlajecctfevwxquowtbelp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805543.2019472-925-101301782617061/AnsiballZ_podman_container_exec.py'
Nov 22 09:59:03 compute-0 sudo[205401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:03 compute-0 python3.9[205403]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:59:03 compute-0 systemd[1]: Started libpod-conmon-a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323.scope.
Nov 22 09:59:03 compute-0 podman[205404]: 2025-11-22 09:59:03.831662951 +0000 UTC m=+0.081366856 container exec a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 09:59:03 compute-0 podman[205404]: 2025-11-22 09:59:03.861073816 +0000 UTC m=+0.110777731 container exec_died a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118)
Nov 22 09:59:03 compute-0 systemd[1]: libpod-conmon-a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323.scope: Deactivated successfully.
Nov 22 09:59:03 compute-0 sudo[205401]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:04 compute-0 sudo[205586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dclqdmgvljglvqyrkxcrwhfplcvzqcln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805544.1059375-933-250967347344651/AnsiballZ_file.py'
Nov 22 09:59:04 compute-0 sudo[205586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:04 compute-0 python3.9[205588]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:04 compute-0 sudo[205586]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:05 compute-0 sudo[205738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihsjzdsjuanzoyncxyghfhfdahtfdwen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805544.8914638-942-227271986793229/AnsiballZ_podman_container_info.py'
Nov 22 09:59:05 compute-0 sudo[205738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:05 compute-0 python3.9[205740]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 22 09:59:05 compute-0 sudo[205738]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:06 compute-0 sudo[205903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibnvrxekuawqtbhnmqbcmmjzytzxzqwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805545.7393465-950-204963852570615/AnsiballZ_podman_container_exec.py'
Nov 22 09:59:06 compute-0 sudo[205903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:06 compute-0 python3.9[205905]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:59:06 compute-0 systemd[1]: Started libpod-conmon-378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325.scope.
Nov 22 09:59:06 compute-0 podman[205906]: 2025-11-22 09:59:06.540906823 +0000 UTC m=+0.077619674 container exec 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 09:59:06 compute-0 podman[205906]: 2025-11-22 09:59:06.550839315 +0000 UTC m=+0.087552136 container exec_died 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 09:59:06 compute-0 sudo[205903]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:06 compute-0 systemd[1]: libpod-conmon-378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325.scope: Deactivated successfully.
Nov 22 09:59:07 compute-0 nova_compute[186981]: 2025-11-22 09:59:07.019 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:59:07 compute-0 sudo[206087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqggwklssafofeksjqwhcpklnoygifrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805546.8504875-958-67835743737139/AnsiballZ_podman_container_exec.py'
Nov 22 09:59:07 compute-0 sudo[206087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:07 compute-0 python3.9[206089]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:59:07 compute-0 systemd[1]: Started libpod-conmon-378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325.scope.
Nov 22 09:59:07 compute-0 nova_compute[186981]: 2025-11-22 09:59:07.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:59:07 compute-0 nova_compute[186981]: 2025-11-22 09:59:07.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:59:07 compute-0 nova_compute[186981]: 2025-11-22 09:59:07.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:59:07 compute-0 podman[206090]: 2025-11-22 09:59:07.599554789 +0000 UTC m=+0.098421563 container exec 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm)
Nov 22 09:59:07 compute-0 podman[206090]: 2025-11-22 09:59:07.634994118 +0000 UTC m=+0.133860882 container exec_died 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 22 09:59:07 compute-0 nova_compute[186981]: 2025-11-22 09:59:07.638 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:59:07 compute-0 nova_compute[186981]: 2025-11-22 09:59:07.639 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:59:07 compute-0 nova_compute[186981]: 2025-11-22 09:59:07.639 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:59:07 compute-0 nova_compute[186981]: 2025-11-22 09:59:07.639 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:59:07 compute-0 nova_compute[186981]: 2025-11-22 09:59:07.639 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:59:07 compute-0 nova_compute[186981]: 2025-11-22 09:59:07.639 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:59:07 compute-0 systemd[1]: libpod-conmon-378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325.scope: Deactivated successfully.
Nov 22 09:59:07 compute-0 sudo[206087]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:08 compute-0 sudo[206269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddxdbdgehkicyeqiowpglkgzjagucdpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805547.8512049-966-208505183679385/AnsiballZ_file.py'
Nov 22 09:59:08 compute-0 sudo[206269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:08 compute-0 python3.9[206271]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:08 compute-0 sudo[206269]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.651 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.652 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.652 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.653 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.826 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.828 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5926MB free_disk=73.49360656738281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.828 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:59:08 compute-0 nova_compute[186981]: 2025-11-22 09:59:08.828 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:59:08 compute-0 sudo[206421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhdroesxpzjdsdscyvzbkelacfcuvcqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805548.6288009-975-174567899735095/AnsiballZ_podman_container_info.py'
Nov 22 09:59:08 compute-0 sudo[206421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:09 compute-0 nova_compute[186981]: 2025-11-22 09:59:09.056 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:59:09 compute-0 nova_compute[186981]: 2025-11-22 09:59:09.056 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:59:09 compute-0 nova_compute[186981]: 2025-11-22 09:59:09.094 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:59:09 compute-0 python3.9[206423]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 22 09:59:09 compute-0 nova_compute[186981]: 2025-11-22 09:59:09.207 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:59:09 compute-0 nova_compute[186981]: 2025-11-22 09:59:09.208 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:59:09 compute-0 nova_compute[186981]: 2025-11-22 09:59:09.208 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.380s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:59:09 compute-0 sudo[206421]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:09 compute-0 podman[206521]: 2025-11-22 09:59:09.627033734 +0000 UTC m=+0.076293858 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 09:59:09 compute-0 sudo[206611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtrmbznclpubvdjtecbqqdtfciztmsvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805549.423809-983-200474576757755/AnsiballZ_podman_container_exec.py'
Nov 22 09:59:09 compute-0 sudo[206611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:09 compute-0 python3.9[206613]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:59:10 compute-0 systemd[1]: Started libpod-conmon-6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc.scope.
Nov 22 09:59:10 compute-0 podman[206614]: 2025-11-22 09:59:10.031961199 +0000 UTC m=+0.080861302 container exec 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 09:59:10 compute-0 podman[206614]: 2025-11-22 09:59:10.061736133 +0000 UTC m=+0.110636256 container exec_died 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 09:59:10 compute-0 systemd[1]: libpod-conmon-6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc.scope: Deactivated successfully.
Nov 22 09:59:10 compute-0 sudo[206611]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:10 compute-0 sudo[206795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tegupavvebofjgsjukcyyfikvtizabmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805550.2881827-991-73987443702144/AnsiballZ_podman_container_exec.py'
Nov 22 09:59:10 compute-0 sudo[206795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:10 compute-0 python3.9[206797]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:59:10 compute-0 systemd[1]: Started libpod-conmon-6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc.scope.
Nov 22 09:59:10 compute-0 podman[206798]: 2025-11-22 09:59:10.885798323 +0000 UTC m=+0.086397015 container exec 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:59:10 compute-0 podman[206798]: 2025-11-22 09:59:10.917116169 +0000 UTC m=+0.117714891 container exec_died 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:59:10 compute-0 systemd[1]: libpod-conmon-6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc.scope: Deactivated successfully.
Nov 22 09:59:10 compute-0 sudo[206795]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:11 compute-0 sudo[206978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyexkqbvnofmhgcosvrndbmjicaudhya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805551.1835492-999-231897822726038/AnsiballZ_file.py'
Nov 22 09:59:11 compute-0 sudo[206978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:11 compute-0 python3.9[206980]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:11 compute-0 sudo[206978]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:12 compute-0 sudo[207130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldmkzxeqtsccwaohykajiemtvkmonnlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805551.9913774-1008-29663664704156/AnsiballZ_podman_container_info.py'
Nov 22 09:59:12 compute-0 sudo[207130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:12 compute-0 python3.9[207132]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 22 09:59:12 compute-0 sudo[207130]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:13 compute-0 sudo[207296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iahzrlgqshvazqnbxpnahcpunsjtngwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805552.7930033-1016-46780611785259/AnsiballZ_podman_container_exec.py'
Nov 22 09:59:13 compute-0 sudo[207296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:13 compute-0 python3.9[207298]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:59:13 compute-0 systemd[1]: Started libpod-conmon-2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b.scope.
Nov 22 09:59:13 compute-0 podman[207299]: 2025-11-22 09:59:13.450960564 +0000 UTC m=+0.096530422 container exec 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 09:59:13 compute-0 podman[207299]: 2025-11-22 09:59:13.480982735 +0000 UTC m=+0.126552593 container exec_died 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:59:13 compute-0 systemd[1]: libpod-conmon-2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b.scope: Deactivated successfully.
Nov 22 09:59:13 compute-0 sudo[207296]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:14 compute-0 sudo[207480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cihbrcxjxsczuolalyunofjcuwbabpwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805553.7661903-1024-37100315583337/AnsiballZ_podman_container_exec.py'
Nov 22 09:59:14 compute-0 sudo[207480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:14 compute-0 python3.9[207482]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:59:14 compute-0 systemd[1]: Started libpod-conmon-2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b.scope.
Nov 22 09:59:14 compute-0 podman[207483]: 2025-11-22 09:59:14.431556525 +0000 UTC m=+0.079200417 container exec 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 09:59:14 compute-0 podman[207483]: 2025-11-22 09:59:14.461874704 +0000 UTC m=+0.109518606 container exec_died 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:59:14 compute-0 systemd[1]: libpod-conmon-2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b.scope: Deactivated successfully.
Nov 22 09:59:14 compute-0 sudo[207480]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:14 compute-0 sudo[207664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrzohtzymjjwmusptqwocxuzfggduvan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805554.7038674-1032-172564983863320/AnsiballZ_file.py'
Nov 22 09:59:14 compute-0 sudo[207664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:15 compute-0 python3.9[207666]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:15 compute-0 sudo[207664]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:15 compute-0 sudo[207816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlnbyacvwmcrtmatqjntpmfdctgmrlkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805555.463779-1041-191150259809261/AnsiballZ_podman_container_info.py'
Nov 22 09:59:15 compute-0 sudo[207816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:15 compute-0 python3.9[207818]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 22 09:59:16 compute-0 sudo[207816]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:16 compute-0 sudo[207981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjgmscuocemnnycpvbmsitqmmokoxkxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805556.2537313-1049-199607908857622/AnsiballZ_podman_container_exec.py'
Nov 22 09:59:16 compute-0 sudo[207981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:16 compute-0 python3.9[207983]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:59:16 compute-0 systemd[1]: Started libpod-conmon-ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296.scope.
Nov 22 09:59:16 compute-0 podman[207984]: 2025-11-22 09:59:16.819509128 +0000 UTC m=+0.077507590 container exec ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 09:59:16 compute-0 podman[207984]: 2025-11-22 09:59:16.849407596 +0000 UTC m=+0.107406038 container exec_died ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 09:59:16 compute-0 systemd[1]: libpod-conmon-ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296.scope: Deactivated successfully.
Nov 22 09:59:16 compute-0 sudo[207981]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:16 compute-0 podman[208001]: 2025-11-22 09:59:16.925361354 +0000 UTC m=+0.089142070 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 09:59:17 compute-0 sudo[208182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwymbqkyntfydqzctjwoaitdbifcdskk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805557.082228-1057-93943297136427/AnsiballZ_podman_container_exec.py'
Nov 22 09:59:17 compute-0 sudo[208182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:17 compute-0 python3.9[208184]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 09:59:17 compute-0 podman[208185]: 2025-11-22 09:59:17.706322454 +0000 UTC m=+0.149280323 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:59:17 compute-0 systemd[1]: Started libpod-conmon-ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296.scope.
Nov 22 09:59:17 compute-0 podman[208211]: 2025-11-22 09:59:17.790395884 +0000 UTC m=+0.083368672 container exec ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Nov 22 09:59:17 compute-0 podman[208211]: 2025-11-22 09:59:17.800045047 +0000 UTC m=+0.093017855 container exec_died ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 09:59:17 compute-0 systemd[1]: libpod-conmon-ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296.scope: Deactivated successfully.
Nov 22 09:59:17 compute-0 sudo[208182]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:59:17.926 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:59:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:59:17.928 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:59:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 09:59:17.929 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:59:18 compute-0 sudo[208394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzwsvtqsemflcamgglcwwxjnlicnukly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805558.0619824-1065-45362235930305/AnsiballZ_file.py'
Nov 22 09:59:18 compute-0 sudo[208394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:18 compute-0 python3.9[208396]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:18 compute-0 sudo[208394]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:19 compute-0 sudo[208546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkwyqheevyoucfzrzpdwpnlinlubhbgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805558.8847187-1074-76733239509223/AnsiballZ_file.py'
Nov 22 09:59:19 compute-0 sudo[208546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:19 compute-0 python3.9[208548]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:19 compute-0 sudo[208546]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:20 compute-0 sudo[208698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaskiwvorjzvtfiusbjwmxmuaysudinz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805559.646239-1082-84770604229988/AnsiballZ_stat.py'
Nov 22 09:59:20 compute-0 sudo[208698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:20 compute-0 python3.9[208700]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:59:20 compute-0 sudo[208698]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:20 compute-0 sudo[208821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wivqekrgyaoqxdpimgdzxumsqtiwtbra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805559.646239-1082-84770604229988/AnsiballZ_copy.py'
Nov 22 09:59:20 compute-0 sudo[208821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:20 compute-0 python3.9[208823]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763805559.646239-1082-84770604229988/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:20 compute-0 sudo[208821]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:21 compute-0 sudo[208973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mftddzxifuzgszifawtkjphaqwjtzhjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805561.1781735-1098-64654890753821/AnsiballZ_file.py'
Nov 22 09:59:21 compute-0 sudo[208973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:21 compute-0 python3.9[208975]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:21 compute-0 sudo[208973]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:22 compute-0 sudo[209125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jihqdodcjqvwhakronkvabnjdcewhqya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805562.0069904-1106-27173029949111/AnsiballZ_stat.py'
Nov 22 09:59:22 compute-0 sudo[209125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:22 compute-0 python3.9[209127]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:59:22 compute-0 sudo[209125]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:22 compute-0 sudo[209233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbmuskthfqxogxzjllhudfbligglupiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805562.0069904-1106-27173029949111/AnsiballZ_file.py'
Nov 22 09:59:22 compute-0 sudo[209233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:22 compute-0 podman[209177]: 2025-11-22 09:59:22.883395645 +0000 UTC m=+0.079340981 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:59:22 compute-0 podman[209178]: 2025-11-22 09:59:22.907344611 +0000 UTC m=+0.105297352 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, container_name=openstack_network_exporter, 
name=ubi9-minimal, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 09:59:23 compute-0 python3.9[209242]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:23 compute-0 sudo[209233]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:23 compute-0 sudo[209393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osssymvzymcdvgldvclizlprheuciymg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805563.288809-1118-189969903045479/AnsiballZ_stat.py'
Nov 22 09:59:23 compute-0 sudo[209393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:23 compute-0 python3.9[209395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:59:23 compute-0 sudo[209393]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:24 compute-0 sudo[209471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izeqakavetgeqcntgmosggqgeezieodc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805563.288809-1118-189969903045479/AnsiballZ_file.py'
Nov 22 09:59:24 compute-0 sudo[209471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:24 compute-0 python3.9[209473]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.iaiptc_m recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:24 compute-0 sudo[209471]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:24 compute-0 sudo[209623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wknkveehlqgftrlfrsbuvkazkgopjuie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805564.6027465-1130-259199725447156/AnsiballZ_stat.py'
Nov 22 09:59:24 compute-0 sudo[209623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:25 compute-0 python3.9[209625]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:59:25 compute-0 sudo[209623]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:25 compute-0 sudo[209701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnqjhmhgnpjhprndsavrjubmgjpdvxvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805564.6027465-1130-259199725447156/AnsiballZ_file.py'
Nov 22 09:59:25 compute-0 sudo[209701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:25 compute-0 python3.9[209703]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:25 compute-0 sudo[209701]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:26 compute-0 sudo[209853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mseoabqvyouttqnybiohasqjzzgfbjog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805565.7618265-1143-113505026107372/AnsiballZ_command.py'
Nov 22 09:59:26 compute-0 sudo[209853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:26 compute-0 python3.9[209855]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:59:26 compute-0 sudo[209853]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:27 compute-0 sudo[210017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzyrclgjitshjssayapuzwtaqunrcifo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763805566.4960973-1151-246298268692461/AnsiballZ_edpm_nftables_from_files.py'
Nov 22 09:59:27 compute-0 sudo[210017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:27 compute-0 podman[209980]: 2025-11-22 09:59:27.136984787 +0000 UTC m=+0.094575267 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:59:27 compute-0 python3[210023]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 09:59:27 compute-0 sudo[210017]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:27 compute-0 sudo[210180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsvmqksjponywipwpquuywpaedtpiptd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805567.5832052-1159-57376575587466/AnsiballZ_stat.py'
Nov 22 09:59:27 compute-0 sudo[210180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:28 compute-0 python3.9[210182]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:59:28 compute-0 sudo[210180]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:28 compute-0 sudo[210269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkhgimcpnjpwstnotnijudbyigybkvpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805567.5832052-1159-57376575587466/AnsiballZ_file.py'
Nov 22 09:59:28 compute-0 sudo[210269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:28 compute-0 podman[210232]: 2025-11-22 09:59:28.43645238 +0000 UTC m=+0.075697212 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 09:59:28 compute-0 python3.9[210280]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:28 compute-0 sudo[210269]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:29 compute-0 sudo[210430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlheksldybvghotiohtskflnuqnmkuhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805568.823895-1171-271607845332355/AnsiballZ_stat.py'
Nov 22 09:59:29 compute-0 sudo[210430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:29 compute-0 python3.9[210432]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:59:29 compute-0 sudo[210430]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:29 compute-0 sudo[210508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyhnplfwjpmpqtoqjjxtzybehwcujnya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805568.823895-1171-271607845332355/AnsiballZ_file.py'
Nov 22 09:59:29 compute-0 sudo[210508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:29 compute-0 python3.9[210510]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:29 compute-0 sudo[210508]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:30 compute-0 sudo[210660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jayriqxfxbbfnlujrpcwyznwxdigtykg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805570.0220811-1183-24693851724698/AnsiballZ_stat.py'
Nov 22 09:59:30 compute-0 sudo[210660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:30 compute-0 python3.9[210662]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:59:30 compute-0 sudo[210660]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:30 compute-0 sudo[210738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coeugaojfytttruujcnnziovcqqpcfhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805570.0220811-1183-24693851724698/AnsiballZ_file.py'
Nov 22 09:59:30 compute-0 sudo[210738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:30 compute-0 python3.9[210740]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:30 compute-0 sudo[210738]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:31 compute-0 sudo[210890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzekwfxmlrtynsftnidomipanihhbeqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805571.141381-1195-10473759475854/AnsiballZ_stat.py'
Nov 22 09:59:31 compute-0 sudo[210890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:31 compute-0 python3.9[210892]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:59:31 compute-0 sudo[210890]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:31 compute-0 sudo[210968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xixkmwlfhepajmaqencmulqfbrlbxqgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805571.141381-1195-10473759475854/AnsiballZ_file.py'
Nov 22 09:59:31 compute-0 sudo[210968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:32 compute-0 python3.9[210970]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:32 compute-0 sudo[210968]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:32 compute-0 sudo[211120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpnmmmqsgeydjgbunkwizeksvnswpsfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805572.3578093-1207-233040968729412/AnsiballZ_stat.py'
Nov 22 09:59:32 compute-0 sudo[211120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:32 compute-0 python3.9[211122]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 09:59:33 compute-0 sudo[211120]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:33 compute-0 sudo[211245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yreuplnmtrirrgpkogkphmaffwwzlzql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805572.3578093-1207-233040968729412/AnsiballZ_copy.py'
Nov 22 09:59:33 compute-0 sudo[211245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:33 compute-0 python3.9[211247]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763805572.3578093-1207-233040968729412/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:33 compute-0 sudo[211245]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:34 compute-0 sudo[211397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecilowmdonkvtoumrzubuqwdscbpfhga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805573.8529165-1222-228289224445608/AnsiballZ_file.py'
Nov 22 09:59:34 compute-0 sudo[211397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:34 compute-0 python3.9[211399]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:34 compute-0 sudo[211397]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:34 compute-0 sudo[211549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orfltfgpbnnkvchqudambuooxhnejudo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805574.6228416-1230-63586582022143/AnsiballZ_command.py'
Nov 22 09:59:34 compute-0 sudo[211549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:35 compute-0 python3.9[211551]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:59:35 compute-0 sudo[211549]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:35 compute-0 sudo[211704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sawqnjnlwykmqiuhqevnabuovjboarsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805575.360402-1238-40357783852664/AnsiballZ_blockinfile.py'
Nov 22 09:59:35 compute-0 sudo[211704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:36 compute-0 python3.9[211706]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:36 compute-0 sudo[211704]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:36 compute-0 sudo[211856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-einrufxbeijehstydudijlaarsdzlbwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805576.3755658-1247-9984037370628/AnsiballZ_command.py'
Nov 22 09:59:36 compute-0 sudo[211856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:36 compute-0 python3.9[211858]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:59:36 compute-0 sudo[211856]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:37 compute-0 sudo[212009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsvzqzmjawcssywhzxqaaplvrwdmwdrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805577.1355028-1255-64135214617988/AnsiballZ_stat.py'
Nov 22 09:59:37 compute-0 sudo[212009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:37 compute-0 python3.9[212011]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 09:59:37 compute-0 sudo[212009]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:38 compute-0 sudo[212163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncvlcfqvgiszsttcjorzislhabdexext ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805578.1232882-1263-132753088501306/AnsiballZ_command.py'
Nov 22 09:59:38 compute-0 sudo[212163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:38 compute-0 python3.9[212165]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 09:59:38 compute-0 sudo[212163]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:39 compute-0 sudo[212318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzyrhgabypqrhqsefdcchqhbbmjfgypp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763805578.8489609-1271-197903817462346/AnsiballZ_file.py'
Nov 22 09:59:39 compute-0 sudo[212318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:59:39 compute-0 python3.9[212320]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 09:59:39 compute-0 sudo[212318]: pam_unix(sudo:session): session closed for user root
Nov 22 09:59:39 compute-0 sshd-session[187282]: Connection closed by 192.168.122.30 port 33972
Nov 22 09:59:39 compute-0 sshd-session[187279]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:59:39 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Nov 22 09:59:39 compute-0 systemd[1]: session-25.scope: Consumed 1min 47.851s CPU time.
Nov 22 09:59:39 compute-0 systemd-logind[819]: Session 25 logged out. Waiting for processes to exit.
Nov 22 09:59:39 compute-0 systemd-logind[819]: Removed session 25.
Nov 22 09:59:39 compute-0 podman[212345]: 2025-11-22 09:59:39.966437639 +0000 UTC m=+0.096622174 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 09:59:47 compute-0 podman[212371]: 2025-11-22 09:59:47.637557821 +0000 UTC m=+0.087122539 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_id=edpm)
Nov 22 09:59:48 compute-0 podman[212392]: 2025-11-22 09:59:48.678508835 +0000 UTC m=+0.124696257 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 09:59:54 compute-0 podman[212419]: 2025-11-22 09:59:54.27242871 +0000 UTC m=+0.064550470 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:59:54 compute-0 podman[212420]: 2025-11-22 09:59:54.273503399 +0000 UTC m=+0.066011840 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Nov 22 09:59:57 compute-0 podman[212457]: 2025-11-22 09:59:57.628553483 +0000 UTC m=+0.075395043 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 09:59:58 compute-0 podman[212481]: 2025-11-22 09:59:58.598693032 +0000 UTC m=+0.059879540 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible)
Nov 22 10:00:08 compute-0 nova_compute[186981]: 2025-11-22 10:00:08.210 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:00:08 compute-0 nova_compute[186981]: 2025-11-22 10:00:08.210 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:00:08 compute-0 nova_compute[186981]: 2025-11-22 10:00:08.211 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:00:08 compute-0 nova_compute[186981]: 2025-11-22 10:00:08.264 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:00:08 compute-0 nova_compute[186981]: 2025-11-22 10:00:08.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:00:08 compute-0 nova_compute[186981]: 2025-11-22 10:00:08.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:00:08 compute-0 nova_compute[186981]: 2025-11-22 10:00:08.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:00:08 compute-0 nova_compute[186981]: 2025-11-22 10:00:08.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:00:09 compute-0 nova_compute[186981]: 2025-11-22 10:00:09.589 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:00:09 compute-0 nova_compute[186981]: 2025-11-22 10:00:09.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:00:09 compute-0 nova_compute[186981]: 2025-11-22 10:00:09.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:00:09 compute-0 nova_compute[186981]: 2025-11-22 10:00:09.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:00:09 compute-0 nova_compute[186981]: 2025-11-22 10:00:09.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:00:09 compute-0 nova_compute[186981]: 2025-11-22 10:00:09.814 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:00:09 compute-0 nova_compute[186981]: 2025-11-22 10:00:09.814 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:00:09 compute-0 nova_compute[186981]: 2025-11-22 10:00:09.814 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:00:09 compute-0 nova_compute[186981]: 2025-11-22 10:00:09.815 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:00:10 compute-0 nova_compute[186981]: 2025-11-22 10:00:10.018 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:00:10 compute-0 nova_compute[186981]: 2025-11-22 10:00:10.020 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6009MB free_disk=73.49288940429688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:00:10 compute-0 nova_compute[186981]: 2025-11-22 10:00:10.020 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:00:10 compute-0 nova_compute[186981]: 2025-11-22 10:00:10.020 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:00:10 compute-0 nova_compute[186981]: 2025-11-22 10:00:10.275 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:00:10 compute-0 nova_compute[186981]: 2025-11-22 10:00:10.276 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:00:10 compute-0 nova_compute[186981]: 2025-11-22 10:00:10.300 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:00:10 compute-0 nova_compute[186981]: 2025-11-22 10:00:10.340 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:00:10 compute-0 nova_compute[186981]: 2025-11-22 10:00:10.344 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:00:10 compute-0 nova_compute[186981]: 2025-11-22 10:00:10.345 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:00:10 compute-0 podman[212501]: 2025-11-22 10:00:10.653653538 +0000 UTC m=+0.100756179 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:00:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:00:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:00:17.927 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:00:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:00:17.928 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:00:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:00:17.928 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:00:18 compute-0 podman[212526]: 2025-11-22 10:00:18.592434049 +0000 UTC m=+0.053759330 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:00:19 compute-0 podman[212546]: 2025-11-22 10:00:19.634290837 +0000 UTC m=+0.090525964 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:00:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:00:22.514 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:00:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:00:22.515 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:00:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:00:22.516 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:00:24 compute-0 podman[212572]: 2025-11-22 10:00:24.602857582 +0000 UTC m=+0.057927216 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:00:24 compute-0 podman[212573]: 2025-11-22 10:00:24.610447002 +0000 UTC m=+0.058660295 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Nov 22 10:00:28 compute-0 podman[212613]: 2025-11-22 10:00:28.619541706 +0000 UTC m=+0.071355751 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:00:28 compute-0 podman[212639]: 2025-11-22 10:00:28.720216261 +0000 UTC m=+0.071832583 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 10:00:41 compute-0 podman[212660]: 2025-11-22 10:00:41.617037852 +0000 UTC m=+0.070539161 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 10:00:49 compute-0 podman[212684]: 2025-11-22 10:00:49.59480481 +0000 UTC m=+0.056704076 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:00:50 compute-0 podman[212705]: 2025-11-22 10:00:50.657902627 +0000 UTC m=+0.112142309 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:00:55 compute-0 podman[212731]: 2025-11-22 10:00:55.594838965 +0000 UTC m=+0.048539976 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 10:00:55 compute-0 podman[212732]: 2025-11-22 10:00:55.613362626 +0000 UTC m=+0.065095914 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal)
Nov 22 10:00:59 compute-0 podman[212769]: 2025-11-22 10:00:59.6293906 +0000 UTC m=+0.074295574 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 10:00:59 compute-0 podman[212770]: 2025-11-22 10:00:59.643356659 +0000 UTC m=+0.083962796 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:01:01 compute-0 CROND[212812]: (root) CMD (run-parts /etc/cron.hourly)
Nov 22 10:01:01 compute-0 run-parts[212815]: (/etc/cron.hourly) starting 0anacron
Nov 22 10:01:01 compute-0 anacron[212823]: Anacron started on 2025-11-22
Nov 22 10:01:01 compute-0 anacron[212823]: Will run job `cron.daily' in 43 min.
Nov 22 10:01:01 compute-0 anacron[212823]: Will run job `cron.weekly' in 63 min.
Nov 22 10:01:01 compute-0 anacron[212823]: Will run job `cron.monthly' in 83 min.
Nov 22 10:01:01 compute-0 anacron[212823]: Jobs will be executed sequentially
Nov 22 10:01:01 compute-0 run-parts[212825]: (/etc/cron.hourly) finished 0anacron
Nov 22 10:01:01 compute-0 CROND[212811]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 22 10:01:09 compute-0 nova_compute[186981]: 2025-11-22 10:01:09.345 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:01:09 compute-0 nova_compute[186981]: 2025-11-22 10:01:09.346 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:01:09 compute-0 nova_compute[186981]: 2025-11-22 10:01:09.346 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:01:09 compute-0 nova_compute[186981]: 2025-11-22 10:01:09.795 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:01:09 compute-0 nova_compute[186981]: 2025-11-22 10:01:09.795 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:01:09 compute-0 nova_compute[186981]: 2025-11-22 10:01:09.796 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:01:09 compute-0 nova_compute[186981]: 2025-11-22 10:01:09.796 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:01:09 compute-0 nova_compute[186981]: 2025-11-22 10:01:09.796 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:01:09 compute-0 nova_compute[186981]: 2025-11-22 10:01:09.796 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:01:10 compute-0 nova_compute[186981]: 2025-11-22 10:01:10.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:01:10 compute-0 nova_compute[186981]: 2025-11-22 10:01:10.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:01:10 compute-0 nova_compute[186981]: 2025-11-22 10:01:10.775 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:01:10 compute-0 nova_compute[186981]: 2025-11-22 10:01:10.777 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:01:10 compute-0 nova_compute[186981]: 2025-11-22 10:01:10.777 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:01:10 compute-0 nova_compute[186981]: 2025-11-22 10:01:10.778 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:01:10 compute-0 nova_compute[186981]: 2025-11-22 10:01:10.956 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:01:10 compute-0 nova_compute[186981]: 2025-11-22 10:01:10.957 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6053MB free_disk=73.4967155456543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:01:10 compute-0 nova_compute[186981]: 2025-11-22 10:01:10.957 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:01:10 compute-0 nova_compute[186981]: 2025-11-22 10:01:10.958 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:01:11 compute-0 nova_compute[186981]: 2025-11-22 10:01:11.851 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:01:11 compute-0 nova_compute[186981]: 2025-11-22 10:01:11.852 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:01:11 compute-0 nova_compute[186981]: 2025-11-22 10:01:11.886 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:01:12 compute-0 podman[212826]: 2025-11-22 10:01:12.609945371 +0000 UTC m=+0.065588208 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:01:12 compute-0 nova_compute[186981]: 2025-11-22 10:01:12.808 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:01:12 compute-0 nova_compute[186981]: 2025-11-22 10:01:12.809 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:01:12 compute-0 nova_compute[186981]: 2025-11-22 10:01:12.809 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.852s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:01:13 compute-0 nova_compute[186981]: 2025-11-22 10:01:13.803 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:01:13 compute-0 nova_compute[186981]: 2025-11-22 10:01:13.804 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:01:14 compute-0 nova_compute[186981]: 2025-11-22 10:01:14.273 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:01:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:01:17.928 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:01:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:01:17.929 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:01:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:01:17.929 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:01:20 compute-0 podman[212850]: 2025-11-22 10:01:20.603807544 +0000 UTC m=+0.060198411 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 22 10:01:21 compute-0 podman[212870]: 2025-11-22 10:01:21.63547894 +0000 UTC m=+0.097204074 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:01:26 compute-0 podman[212897]: 2025-11-22 10:01:26.59976975 +0000 UTC m=+0.051625410 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 10:01:26 compute-0 podman[212898]: 2025-11-22 10:01:26.624404326 +0000 UTC m=+0.070014997 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 10:01:30 compute-0 podman[212936]: 2025-11-22 10:01:30.644297915 +0000 UTC m=+0.091265153 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 10:01:30 compute-0 podman[212937]: 2025-11-22 10:01:30.658479379 +0000 UTC m=+0.094217853 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd)
Nov 22 10:01:43 compute-0 podman[212981]: 2025-11-22 10:01:43.642148186 +0000 UTC m=+0.084436760 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 10:01:51 compute-0 podman[213006]: 2025-11-22 10:01:51.637905414 +0000 UTC m=+0.084716137 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:01:51 compute-0 podman[213027]: 2025-11-22 10:01:51.846449 +0000 UTC m=+0.166100866 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 10:01:57 compute-0 podman[213053]: 2025-11-22 10:01:57.604350748 +0000 UTC m=+0.056540514 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:01:57 compute-0 podman[213054]: 2025-11-22 10:01:57.611925853 +0000 UTC m=+0.061054867 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 10:02:01 compute-0 podman[213095]: 2025-11-22 10:02:01.589446276 +0000 UTC m=+0.048538228 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 10:02:01 compute-0 podman[213096]: 2025-11-22 10:02:01.6043722 +0000 UTC m=+0.060545423 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:02:06 compute-0 nova_compute[186981]: 2025-11-22 10:02:06.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:06 compute-0 nova_compute[186981]: 2025-11-22 10:02:06.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 10:02:06 compute-0 nova_compute[186981]: 2025-11-22 10:02:06.694 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 10:02:06 compute-0 nova_compute[186981]: 2025-11-22 10:02:06.695 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:06 compute-0 nova_compute[186981]: 2025-11-22 10:02:06.695 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 10:02:06 compute-0 nova_compute[186981]: 2025-11-22 10:02:06.774 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:08 compute-0 nova_compute[186981]: 2025-11-22 10:02:08.832 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:08 compute-0 nova_compute[186981]: 2025-11-22 10:02:08.833 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:02:08 compute-0 nova_compute[186981]: 2025-11-22 10:02:08.833 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:02:09 compute-0 nova_compute[186981]: 2025-11-22 10:02:09.046 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:02:09 compute-0 nova_compute[186981]: 2025-11-22 10:02:09.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:09 compute-0 nova_compute[186981]: 2025-11-22 10:02:09.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:09 compute-0 nova_compute[186981]: 2025-11-22 10:02:09.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:02:10 compute-0 nova_compute[186981]: 2025-11-22 10:02:10.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:10 compute-0 nova_compute[186981]: 2025-11-22 10:02:10.778 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:02:10 compute-0 nova_compute[186981]: 2025-11-22 10:02:10.778 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:02:10 compute-0 nova_compute[186981]: 2025-11-22 10:02:10.778 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:02:10 compute-0 nova_compute[186981]: 2025-11-22 10:02:10.779 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:02:10 compute-0 nova_compute[186981]: 2025-11-22 10:02:10.950 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:02:10 compute-0 nova_compute[186981]: 2025-11-22 10:02:10.951 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6061MB free_disk=73.4967155456543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:02:10 compute-0 nova_compute[186981]: 2025-11-22 10:02:10.952 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:02:10 compute-0 nova_compute[186981]: 2025-11-22 10:02:10.952 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:02:11 compute-0 nova_compute[186981]: 2025-11-22 10:02:11.549 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:02:11 compute-0 nova_compute[186981]: 2025-11-22 10:02:11.549 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:02:11 compute-0 nova_compute[186981]: 2025-11-22 10:02:11.661 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing inventories for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 10:02:11 compute-0 nova_compute[186981]: 2025-11-22 10:02:11.733 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating ProviderTree inventory for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 10:02:11 compute-0 nova_compute[186981]: 2025-11-22 10:02:11.734 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating inventory in ProviderTree for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 10:02:11 compute-0 nova_compute[186981]: 2025-11-22 10:02:11.828 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing aggregate associations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 10:02:11 compute-0 nova_compute[186981]: 2025-11-22 10:02:11.849 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing trait associations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703, traits: COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE2,HW_CPU_X86_F16C,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE4A,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 10:02:11 compute-0 nova_compute[186981]: 2025-11-22 10:02:11.871 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:02:11 compute-0 nova_compute[186981]: 2025-11-22 10:02:11.978 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:02:11 compute-0 nova_compute[186981]: 2025-11-22 10:02:11.980 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:02:11 compute-0 nova_compute[186981]: 2025-11-22 10:02:11.980 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:02:12 compute-0 nova_compute[186981]: 2025-11-22 10:02:12.975 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:12 compute-0 nova_compute[186981]: 2025-11-22 10:02:12.976 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:12 compute-0 nova_compute[186981]: 2025-11-22 10:02:12.976 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:12 compute-0 nova_compute[186981]: 2025-11-22 10:02:12.976 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:13 compute-0 nova_compute[186981]: 2025-11-22 10:02:13.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:02:14 compute-0 podman[213139]: 2025-11-22 10:02:14.611946786 +0000 UTC m=+0.065220870 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:02:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:02:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:02:17.929 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:02:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:02:17.929 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:02:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:02:17.930 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:02:22 compute-0 podman[213163]: 2025-11-22 10:02:22.626991058 +0000 UTC m=+0.076933876 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Nov 22 10:02:22 compute-0 podman[213164]: 2025-11-22 10:02:22.667520968 +0000 UTC m=+0.114076765 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 22 10:02:27 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:02:27.724 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:02:27 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:02:27.726 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:02:28 compute-0 podman[213209]: 2025-11-22 10:02:28.629303634 +0000 UTC m=+0.059279918 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 10:02:28 compute-0 podman[213208]: 2025-11-22 10:02:28.662502474 +0000 UTC m=+0.091868022 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 10:02:32 compute-0 podman[213250]: 2025-11-22 10:02:32.604415111 +0000 UTC m=+0.055863056 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 10:02:32 compute-0 podman[213249]: 2025-11-22 10:02:32.609501179 +0000 UTC m=+0.061543810 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:02:34 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:02:34.727 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:02:45 compute-0 podman[213291]: 2025-11-22 10:02:45.587823709 +0000 UTC m=+0.048854577 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.245 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.246 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.275 186985 DEBUG nova.compute.manager [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.390 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.390 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.396 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.396 186985 INFO nova.compute.claims [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.495 186985 DEBUG nova.compute.provider_tree [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.508 186985 DEBUG nova.scheduler.client.report [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.524 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.525 186985 DEBUG nova.compute.manager [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.575 186985 DEBUG nova.compute.manager [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.575 186985 DEBUG nova.network.neutron [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.596 186985 INFO nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.616 186985 DEBUG nova.compute.manager [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.693 186985 DEBUG nova.compute.manager [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.694 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.695 186985 INFO nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Creating image(s)
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.695 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.696 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.696 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.697 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:02:51 compute-0 nova_compute[186981]: 2025-11-22 10:02:51.697 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:02:52 compute-0 nova_compute[186981]: 2025-11-22 10:02:52.575 186985 WARNING oslo_policy.policy [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 22 10:02:52 compute-0 nova_compute[186981]: 2025-11-22 10:02:52.575 186985 WARNING oslo_policy.policy [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 22 10:02:52 compute-0 nova_compute[186981]: 2025-11-22 10:02:52.577 186985 DEBUG nova.policy [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:02:53 compute-0 nova_compute[186981]: 2025-11-22 10:02:53.331 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:02:53 compute-0 nova_compute[186981]: 2025-11-22 10:02:53.382 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb.part --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:02:53 compute-0 nova_compute[186981]: 2025-11-22 10:02:53.384 186985 DEBUG nova.virt.images [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] 7f933537-dfd2-407d-a523-ec45187c75fc was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 22 10:02:53 compute-0 nova_compute[186981]: 2025-11-22 10:02:53.385 186985 DEBUG nova.privsep.utils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 10:02:53 compute-0 nova_compute[186981]: 2025-11-22 10:02:53.385 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb.part /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:02:53 compute-0 nova_compute[186981]: 2025-11-22 10:02:53.598 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb.part /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb.converted" returned: 0 in 0.213s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:02:53 compute-0 nova_compute[186981]: 2025-11-22 10:02:53.602 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:02:53 compute-0 podman[213325]: 2025-11-22 10:02:53.609642725 +0000 UTC m=+0.062763356 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 10:02:53 compute-0 podman[213326]: 2025-11-22 10:02:53.638306748 +0000 UTC m=+0.089126896 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 10:02:53 compute-0 nova_compute[186981]: 2025-11-22 10:02:53.659 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb.converted --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:02:53 compute-0 nova_compute[186981]: 2025-11-22 10:02:53.660 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:02:53 compute-0 nova_compute[186981]: 2025-11-22 10:02:53.675 186985 INFO oslo.privsep.daemon [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpdl6uf76s/privsep.sock']
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.300 186985 INFO oslo.privsep.daemon [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Spawned new privsep daemon via rootwrap
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.192 213374 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.195 213374 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.197 213374 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.197 213374 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213374
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.376 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.449 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.450 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.450 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.461 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.525 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.526 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.561 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.562 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.562 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.642 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.644 186985 DEBUG nova.virt.disk.api [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.644 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.697 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.698 186985 DEBUG nova.virt.disk.api [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.699 186985 DEBUG nova.objects.instance [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 2c39aa0d-071b-45a1-9df3-aa0aadadf528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.929 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.930 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Ensure instance console log exists: /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.930 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.930 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:02:54 compute-0 nova_compute[186981]: 2025-11-22 10:02:54.931 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:02:55 compute-0 nova_compute[186981]: 2025-11-22 10:02:55.023 186985 DEBUG nova.network.neutron [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Successfully created port: 6465f074-89d4-4e64-b119-166c8af9a08e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:02:56 compute-0 nova_compute[186981]: 2025-11-22 10:02:56.032 186985 DEBUG nova.network.neutron [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Successfully updated port: 6465f074-89d4-4e64-b119-166c8af9a08e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:02:56 compute-0 nova_compute[186981]: 2025-11-22 10:02:56.060 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:02:56 compute-0 nova_compute[186981]: 2025-11-22 10:02:56.060 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:02:56 compute-0 nova_compute[186981]: 2025-11-22 10:02:56.061 186985 DEBUG nova.network.neutron [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:02:56 compute-0 nova_compute[186981]: 2025-11-22 10:02:56.553 186985 DEBUG nova.network.neutron [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:02:56 compute-0 nova_compute[186981]: 2025-11-22 10:02:56.760 186985 DEBUG nova.compute.manager [req-81297cff-362c-415c-976b-c3c0a977d7d2 req-3767d50e-a655-4db7-8082-6a7b348f4159 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Received event network-changed-6465f074-89d4-4e64-b119-166c8af9a08e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:02:56 compute-0 nova_compute[186981]: 2025-11-22 10:02:56.761 186985 DEBUG nova.compute.manager [req-81297cff-362c-415c-976b-c3c0a977d7d2 req-3767d50e-a655-4db7-8082-6a7b348f4159 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Refreshing instance network info cache due to event network-changed-6465f074-89d4-4e64-b119-166c8af9a08e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:02:56 compute-0 nova_compute[186981]: 2025-11-22 10:02:56.761 186985 DEBUG oslo_concurrency.lockutils [req-81297cff-362c-415c-976b-c3c0a977d7d2 req-3767d50e-a655-4db7-8082-6a7b348f4159 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.713 186985 DEBUG nova.network.neutron [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updating instance_info_cache with network_info: [{"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.764 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.764 186985 DEBUG nova.compute.manager [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Instance network_info: |[{"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.765 186985 DEBUG oslo_concurrency.lockutils [req-81297cff-362c-415c-976b-c3c0a977d7d2 req-3767d50e-a655-4db7-8082-6a7b348f4159 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.765 186985 DEBUG nova.network.neutron [req-81297cff-362c-415c-976b-c3c0a977d7d2 req-3767d50e-a655-4db7-8082-6a7b348f4159 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Refreshing network info cache for port 6465f074-89d4-4e64-b119-166c8af9a08e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.771 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Start _get_guest_xml network_info=[{"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.776 186985 WARNING nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.783 186985 DEBUG nova.virt.libvirt.host [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.784 186985 DEBUG nova.virt.libvirt.host [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.789 186985 DEBUG nova.virt.libvirt.host [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.790 186985 DEBUG nova.virt.libvirt.host [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.791 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.791 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.792 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.793 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.793 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.793 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.794 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.794 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.795 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.795 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.795 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.796 186985 DEBUG nova.virt.hardware [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.804 186985 DEBUG nova.privsep.utils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.805 186985 DEBUG nova.virt.libvirt.vif [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-274452189',display_name='tempest-TestNetworkBasicOps-server-274452189',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-274452189',id=1,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPqQNcA3mkBmrzfDPkqcaEj4rLol6lzA6hJX1xi7/BH7qavf/IxLCw6ctGHitrvd0GkDOkwEd5ZIr0N0xpJmzktofgCqz8QqwgTcYI7MEXIdckVheN6EhshMzlhT2jz8Ww==',key_name='tempest-TestNetworkBasicOps-971114335',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ps8oimsf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:02:51Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=2c39aa0d-071b-45a1-9df3-aa0aadadf528,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.806 186985 DEBUG nova.network.os_vif_util [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.807 186985 DEBUG nova.network.os_vif_util [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:ae:57,bridge_name='br-int',has_traffic_filtering=True,id=6465f074-89d4-4e64-b119-166c8af9a08e,network=Network(8e9e0707-a3e1-46b9-90a3-9a4c8f606339),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6465f074-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.810 186985 DEBUG nova.objects.instance [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2c39aa0d-071b-45a1-9df3-aa0aadadf528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.863 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:02:57 compute-0 nova_compute[186981]:   <uuid>2c39aa0d-071b-45a1-9df3-aa0aadadf528</uuid>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   <name>instance-00000001</name>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-274452189</nova:name>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:02:57</nova:creationTime>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:02:57 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:02:57 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:02:57 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:02:57 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:02:57 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:02:57 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:02:57 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:02:57 compute-0 nova_compute[186981]:         <nova:port uuid="6465f074-89d4-4e64-b119-166c8af9a08e">
Nov 22 10:02:57 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <system>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <entry name="serial">2c39aa0d-071b-45a1-9df3-aa0aadadf528</entry>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <entry name="uuid">2c39aa0d-071b-45a1-9df3-aa0aadadf528</entry>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     </system>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   <os>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   </os>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   <features>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   </features>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk.config"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:f8:ae:57"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <target dev="tap6465f074-89"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/console.log" append="off"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <video>
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     </video>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:02:57 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:02:57 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:02:57 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:02:57 compute-0 nova_compute[186981]: </domain>
Nov 22 10:02:57 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.864 186985 DEBUG nova.compute.manager [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Preparing to wait for external event network-vif-plugged-6465f074-89d4-4e64-b119-166c8af9a08e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.865 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.865 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.865 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.865 186985 DEBUG nova.virt.libvirt.vif [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-274452189',display_name='tempest-TestNetworkBasicOps-server-274452189',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-274452189',id=1,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPqQNcA3mkBmrzfDPkqcaEj4rLol6lzA6hJX1xi7/BH7qavf/IxLCw6ctGHitrvd0GkDOkwEd5ZIr0N0xpJmzktofgCqz8QqwgTcYI7MEXIdckVheN6EhshMzlhT2jz8Ww==',key_name='tempest-TestNetworkBasicOps-971114335',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ps8oimsf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:02:51Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=2c39aa0d-071b-45a1-9df3-aa0aadadf528,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.866 186985 DEBUG nova.network.os_vif_util [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.866 186985 DEBUG nova.network.os_vif_util [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:ae:57,bridge_name='br-int',has_traffic_filtering=True,id=6465f074-89d4-4e64-b119-166c8af9a08e,network=Network(8e9e0707-a3e1-46b9-90a3-9a4c8f606339),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6465f074-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.866 186985 DEBUG os_vif [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:ae:57,bridge_name='br-int',has_traffic_filtering=True,id=6465f074-89d4-4e64-b119-166c8af9a08e,network=Network(8e9e0707-a3e1-46b9-90a3-9a4c8f606339),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6465f074-89') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.895 186985 DEBUG ovsdbapp.backend.ovs_idl [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.896 186985 DEBUG ovsdbapp.backend.ovs_idl [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.896 186985 DEBUG ovsdbapp.backend.ovs_idl [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.896 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.897 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.897 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.897 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.898 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.900 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.908 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.908 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.908 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:02:57 compute-0 nova_compute[186981]: 2025-11-22 10:02:57.909 186985 INFO oslo.privsep.daemon [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp3086egwn/privsep.sock']
Nov 22 10:02:58 compute-0 nova_compute[186981]: 2025-11-22 10:02:58.588 186985 INFO oslo.privsep.daemon [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Spawned new privsep daemon via rootwrap
Nov 22 10:02:58 compute-0 nova_compute[186981]: 2025-11-22 10:02:58.486 213395 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 10:02:58 compute-0 nova_compute[186981]: 2025-11-22 10:02:58.490 213395 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 10:02:58 compute-0 nova_compute[186981]: 2025-11-22 10:02:58.492 213395 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 22 10:02:58 compute-0 nova_compute[186981]: 2025-11-22 10:02:58.492 213395 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213395
Nov 22 10:02:58 compute-0 nova_compute[186981]: 2025-11-22 10:02:58.892 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:02:58 compute-0 nova_compute[186981]: 2025-11-22 10:02:58.892 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6465f074-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:02:58 compute-0 nova_compute[186981]: 2025-11-22 10:02:58.893 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6465f074-89, col_values=(('external_ids', {'iface-id': '6465f074-89d4-4e64-b119-166c8af9a08e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:ae:57', 'vm-uuid': '2c39aa0d-071b-45a1-9df3-aa0aadadf528'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:02:58 compute-0 NetworkManager[55425]: <info>  [1763805778.8952] manager: (tap6465f074-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 22 10:02:58 compute-0 nova_compute[186981]: 2025-11-22 10:02:58.897 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:02:58 compute-0 nova_compute[186981]: 2025-11-22 10:02:58.900 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:02:58 compute-0 nova_compute[186981]: 2025-11-22 10:02:58.901 186985 INFO os_vif [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:ae:57,bridge_name='br-int',has_traffic_filtering=True,id=6465f074-89d4-4e64-b119-166c8af9a08e,network=Network(8e9e0707-a3e1-46b9-90a3-9a4c8f606339),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6465f074-89')
Nov 22 10:02:59 compute-0 nova_compute[186981]: 2025-11-22 10:02:59.023 186985 DEBUG nova.network.neutron [req-81297cff-362c-415c-976b-c3c0a977d7d2 req-3767d50e-a655-4db7-8082-6a7b348f4159 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updated VIF entry in instance network info cache for port 6465f074-89d4-4e64-b119-166c8af9a08e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:02:59 compute-0 nova_compute[186981]: 2025-11-22 10:02:59.023 186985 DEBUG nova.network.neutron [req-81297cff-362c-415c-976b-c3c0a977d7d2 req-3767d50e-a655-4db7-8082-6a7b348f4159 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updating instance_info_cache with network_info: [{"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:02:59 compute-0 nova_compute[186981]: 2025-11-22 10:02:59.056 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:02:59 compute-0 nova_compute[186981]: 2025-11-22 10:02:59.056 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:02:59 compute-0 nova_compute[186981]: 2025-11-22 10:02:59.056 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:f8:ae:57, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:02:59 compute-0 nova_compute[186981]: 2025-11-22 10:02:59.057 186985 INFO nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Using config drive
Nov 22 10:02:59 compute-0 nova_compute[186981]: 2025-11-22 10:02:59.409 186985 DEBUG oslo_concurrency.lockutils [req-81297cff-362c-415c-976b-c3c0a977d7d2 req-3767d50e-a655-4db7-8082-6a7b348f4159 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:02:59 compute-0 podman[213402]: 2025-11-22 10:02:59.615011099 +0000 UTC m=+0.062525379 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64)
Nov 22 10:02:59 compute-0 podman[213401]: 2025-11-22 10:02:59.639453918 +0000 UTC m=+0.089865738 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 10:02:59 compute-0 nova_compute[186981]: 2025-11-22 10:02:59.805 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:01 compute-0 nova_compute[186981]: 2025-11-22 10:03:01.602 186985 INFO nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Creating config drive at /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk.config
Nov 22 10:03:01 compute-0 nova_compute[186981]: 2025-11-22 10:03:01.608 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5xd9ta4c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:03:01 compute-0 nova_compute[186981]: 2025-11-22 10:03:01.732 186985 DEBUG oslo_concurrency.processutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5xd9ta4c" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:03:01 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 22 10:03:01 compute-0 kernel: tap6465f074-89: entered promiscuous mode
Nov 22 10:03:01 compute-0 NetworkManager[55425]: <info>  [1763805781.8492] manager: (tap6465f074-89): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Nov 22 10:03:01 compute-0 systemd-udevd[213459]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:03:01 compute-0 NetworkManager[55425]: <info>  [1763805781.8832] device (tap6465f074-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:03:01 compute-0 NetworkManager[55425]: <info>  [1763805781.8839] device (tap6465f074-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:03:01 compute-0 ovn_controller[95329]: 2025-11-22T10:03:01Z|00027|binding|INFO|Claiming lport 6465f074-89d4-4e64-b119-166c8af9a08e for this chassis.
Nov 22 10:03:01 compute-0 ovn_controller[95329]: 2025-11-22T10:03:01Z|00028|binding|INFO|6465f074-89d4-4e64-b119-166c8af9a08e: Claiming fa:16:3e:f8:ae:57 10.100.0.7
Nov 22 10:03:01 compute-0 nova_compute[186981]: 2025-11-22 10:03:01.909 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:01 compute-0 nova_compute[186981]: 2025-11-22 10:03:01.911 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:01.993 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ae:57 10.100.0.7'], port_security=['fa:16:3e:f8:ae:57 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2c39aa0d-071b-45a1-9df3-aa0aadadf528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e9e0707-a3e1-46b9-90a3-9a4c8f606339', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b79e302f-0fd7-4a4d-968d-0a04ab688694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab26dd7a-ff98-4f9a-83f2-1180a5ec2aae, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=6465f074-89d4-4e64-b119-166c8af9a08e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:03:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:01.995 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 6465f074-89d4-4e64-b119-166c8af9a08e in datapath 8e9e0707-a3e1-46b9-90a3-9a4c8f606339 bound to our chassis
Nov 22 10:03:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:01.996 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e9e0707-a3e1-46b9-90a3-9a4c8f606339
Nov 22 10:03:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:01.997 104216 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpm8xtj43n/privsep.sock']
Nov 22 10:03:01 compute-0 systemd-machined[153303]: New machine qemu-1-instance-00000001.
Nov 22 10:03:02 compute-0 nova_compute[186981]: 2025-11-22 10:03:02.015 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:02 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Nov 22 10:03:02 compute-0 ovn_controller[95329]: 2025-11-22T10:03:02Z|00029|binding|INFO|Setting lport 6465f074-89d4-4e64-b119-166c8af9a08e ovn-installed in OVS
Nov 22 10:03:02 compute-0 ovn_controller[95329]: 2025-11-22T10:03:02Z|00030|binding|INFO|Setting lport 6465f074-89d4-4e64-b119-166c8af9a08e up in Southbound
Nov 22 10:03:02 compute-0 nova_compute[186981]: 2025-11-22 10:03:02.021 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:02 compute-0 nova_compute[186981]: 2025-11-22 10:03:02.356 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805782.3551924, 2c39aa0d-071b-45a1-9df3-aa0aadadf528 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:03:02 compute-0 nova_compute[186981]: 2025-11-22 10:03:02.356 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] VM Started (Lifecycle Event)
Nov 22 10:03:02 compute-0 nova_compute[186981]: 2025-11-22 10:03:02.512 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:03:02 compute-0 nova_compute[186981]: 2025-11-22 10:03:02.516 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805782.3552957, 2c39aa0d-071b-45a1-9df3-aa0aadadf528 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:03:02 compute-0 nova_compute[186981]: 2025-11-22 10:03:02.516 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] VM Paused (Lifecycle Event)
Nov 22 10:03:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:02.640 104216 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 22 10:03:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:02.642 104216 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpm8xtj43n/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 22 10:03:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:02.532 213484 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 10:03:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:02.540 213484 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 10:03:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:02.544 213484 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 22 10:03:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:02.544 213484 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213484
Nov 22 10:03:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:02.645 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[00f4a26c-09f5-4118-8a2a-4bfcfdb0e3b8]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:02 compute-0 nova_compute[186981]: 2025-11-22 10:03:02.679 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:03:02 compute-0 nova_compute[186981]: 2025-11-22 10:03:02.684 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:03:02 compute-0 nova_compute[186981]: 2025-11-22 10:03:02.810 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:03:03 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:03.151 213484 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:03 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:03.151 213484 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:03 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:03.151 213484 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.329 186985 DEBUG nova.compute.manager [req-23fb48a5-ddd7-48a8-84d2-ef2059c1ac72 req-0ed3ea1e-3bcf-4e9a-81af-725c5d4ef193 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Received event network-vif-plugged-6465f074-89d4-4e64-b119-166c8af9a08e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.330 186985 DEBUG oslo_concurrency.lockutils [req-23fb48a5-ddd7-48a8-84d2-ef2059c1ac72 req-0ed3ea1e-3bcf-4e9a-81af-725c5d4ef193 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.330 186985 DEBUG oslo_concurrency.lockutils [req-23fb48a5-ddd7-48a8-84d2-ef2059c1ac72 req-0ed3ea1e-3bcf-4e9a-81af-725c5d4ef193 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.330 186985 DEBUG oslo_concurrency.lockutils [req-23fb48a5-ddd7-48a8-84d2-ef2059c1ac72 req-0ed3ea1e-3bcf-4e9a-81af-725c5d4ef193 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.331 186985 DEBUG nova.compute.manager [req-23fb48a5-ddd7-48a8-84d2-ef2059c1ac72 req-0ed3ea1e-3bcf-4e9a-81af-725c5d4ef193 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Processing event network-vif-plugged-6465f074-89d4-4e64-b119-166c8af9a08e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.331 186985 DEBUG nova.compute.manager [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.335 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805783.3346593, 2c39aa0d-071b-45a1-9df3-aa0aadadf528 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.335 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] VM Resumed (Lifecycle Event)
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.338 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.342 186985 INFO nova.virt.libvirt.driver [-] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Instance spawned successfully.
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.342 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:03:03 compute-0 podman[213489]: 2025-11-22 10:03:03.61046438 +0000 UTC m=+0.059436275 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 10:03:03 compute-0 podman[213490]: 2025-11-22 10:03:03.611349174 +0000 UTC m=+0.057715148 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 10:03:03 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:03.684 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[d28a0b1e-16d7-418f-88fb-23f2f868f1d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:03 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:03.685 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e9e0707-a1 in ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:03:03 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:03.687 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e9e0707-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:03:03 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:03.687 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[80e2efd2-8b77-407c-954b-862da98793dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:03 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:03.689 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[93caa535-3b46-499d-9e2c-b84537666aec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:03 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:03.711 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[28de8d23-0cf2-4ad1-8782-506d82beac8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:03 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:03.723 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[8841949e-7b68-49a1-99dd-6f6eff6e1722]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:03 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:03.724 104216 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmphlxrlpn7/privsep.sock']
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.848 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.853 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.862 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.863 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.865 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.866 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.867 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.867 186985 DEBUG nova.virt.libvirt.driver [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.895 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:03 compute-0 nova_compute[186981]: 2025-11-22 10:03:03.969 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:03:04 compute-0 nova_compute[186981]: 2025-11-22 10:03:04.196 186985 INFO nova.compute.manager [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Took 12.50 seconds to spawn the instance on the hypervisor.
Nov 22 10:03:04 compute-0 nova_compute[186981]: 2025-11-22 10:03:04.198 186985 DEBUG nova.compute.manager [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:03:04 compute-0 nova_compute[186981]: 2025-11-22 10:03:04.301 186985 INFO nova.compute.manager [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Took 12.94 seconds to build instance.
Nov 22 10:03:04 compute-0 nova_compute[186981]: 2025-11-22 10:03:04.381 186985 DEBUG oslo_concurrency.lockutils [None req-846f4e59-963d-48c6-9a93-0042e5b8ab81 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:04 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:04.435 104216 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 22 10:03:04 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:04.437 104216 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmphlxrlpn7/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 22 10:03:04 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:04.307 213545 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 10:03:04 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:04.316 213545 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 10:03:04 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:04.322 213545 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 22 10:03:04 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:04.323 213545 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213545
Nov 22 10:03:04 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:04.440 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[71b17879-7c4a-4d46-b2e7-ce6d93198f9c]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:04 compute-0 nova_compute[186981]: 2025-11-22 10:03:04.810 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:04 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:04.919 213545 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:04 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:04.919 213545 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:04 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:04.919 213545 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:05 compute-0 nova_compute[186981]: 2025-11-22 10:03:05.443 186985 DEBUG nova.compute.manager [req-f958099a-56c6-4c25-9e12-6ca0e13c2767 req-f9221889-cdc4-4295-8e06-94133b99acd0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Received event network-vif-plugged-6465f074-89d4-4e64-b119-166c8af9a08e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:03:05 compute-0 nova_compute[186981]: 2025-11-22 10:03:05.444 186985 DEBUG oslo_concurrency.lockutils [req-f958099a-56c6-4c25-9e12-6ca0e13c2767 req-f9221889-cdc4-4295-8e06-94133b99acd0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:05 compute-0 nova_compute[186981]: 2025-11-22 10:03:05.444 186985 DEBUG oslo_concurrency.lockutils [req-f958099a-56c6-4c25-9e12-6ca0e13c2767 req-f9221889-cdc4-4295-8e06-94133b99acd0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:05 compute-0 nova_compute[186981]: 2025-11-22 10:03:05.445 186985 DEBUG oslo_concurrency.lockutils [req-f958099a-56c6-4c25-9e12-6ca0e13c2767 req-f9221889-cdc4-4295-8e06-94133b99acd0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:05 compute-0 nova_compute[186981]: 2025-11-22 10:03:05.445 186985 DEBUG nova.compute.manager [req-f958099a-56c6-4c25-9e12-6ca0e13c2767 req-f9221889-cdc4-4295-8e06-94133b99acd0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] No waiting events found dispatching network-vif-plugged-6465f074-89d4-4e64-b119-166c8af9a08e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:03:05 compute-0 nova_compute[186981]: 2025-11-22 10:03:05.445 186985 WARNING nova.compute.manager [req-f958099a-56c6-4c25-9e12-6ca0e13c2767 req-f9221889-cdc4-4295-8e06-94133b99acd0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Received unexpected event network-vif-plugged-6465f074-89d4-4e64-b119-166c8af9a08e for instance with vm_state active and task_state None.
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.490 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[a8f8ba6b-9d16-4328-b008-8526b26d592a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.509 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[ab9b4d11-faa8-4ac8-b2bf-fa8e2946ed09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:05 compute-0 NetworkManager[55425]: <info>  [1763805785.5117] manager: (tap8e9e0707-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Nov 22 10:03:05 compute-0 systemd-udevd[213557]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.546 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[2e8a9f77-caa9-41b4-b504-4c643837fba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.551 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[f37fe8d2-8fff-44e5-876f-314f541375b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:05 compute-0 NetworkManager[55425]: <info>  [1763805785.5709] device (tap8e9e0707-a0): carrier: link connected
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.576 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[26cddc2f-4bfe-4350-9a5b-f578c10c6842]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.590 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[5d596efd-5557-44d4-87ce-9be4e2d50962]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e9e0707-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:d8:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 324327, 'reachable_time': 20253, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213575, 'error': None, 'target': 'ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.605 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[38df032e-ee43-458a-801f-3a9e5a89ef4e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:d8f4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 324327, 'tstamp': 324327}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213576, 'error': None, 'target': 'ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.618 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[bccb11bc-45be-4e9d-b4f5-13573c62edb9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e9e0707-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:d8:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 324327, 'reachable_time': 20253, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213577, 'error': None, 'target': 'ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.641 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[f472ba02-6ed4-4784-b83c-4573d371da86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.700 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[76062b70-3525-4b7d-b1ff-890ac70e7a46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.702 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e9e0707-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.703 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.705 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e9e0707-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:03:05 compute-0 nova_compute[186981]: 2025-11-22 10:03:05.707 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:05 compute-0 NetworkManager[55425]: <info>  [1763805785.7083] manager: (tap8e9e0707-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Nov 22 10:03:05 compute-0 kernel: tap8e9e0707-a0: entered promiscuous mode
Nov 22 10:03:05 compute-0 nova_compute[186981]: 2025-11-22 10:03:05.710 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.716 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e9e0707-a0, col_values=(('external_ids', {'iface-id': 'c7b7efd9-65c4-4838-801c-c0373e9998a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:03:05 compute-0 nova_compute[186981]: 2025-11-22 10:03:05.718 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:05 compute-0 ovn_controller[95329]: 2025-11-22T10:03:05Z|00031|binding|INFO|Releasing lport c7b7efd9-65c4-4838-801c-c0373e9998a3 from this chassis (sb_readonly=0)
Nov 22 10:03:05 compute-0 nova_compute[186981]: 2025-11-22 10:03:05.719 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.722 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e9e0707-a3e1-46b9-90a3-9a4c8f606339.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e9e0707-a3e1-46b9-90a3-9a4c8f606339.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.739 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad37abc-aa60-4812-b4c7-d542bbce832c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.740 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-8e9e0707-a3e1-46b9-90a3-9a4c8f606339
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/8e9e0707-a3e1-46b9-90a3-9a4c8f606339.pid.haproxy
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID 8e9e0707-a3e1-46b9-90a3-9a4c8f606339
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:03:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:05.741 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339', 'env', 'PROCESS_TAG=haproxy-8e9e0707-a3e1-46b9-90a3-9a4c8f606339', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e9e0707-a3e1-46b9-90a3-9a4c8f606339.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:03:05 compute-0 nova_compute[186981]: 2025-11-22 10:03:05.744 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:06 compute-0 podman[213610]: 2025-11-22 10:03:06.151615963 +0000 UTC m=+0.078712362 container create 2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 10:03:06 compute-0 podman[213610]: 2025-11-22 10:03:06.094990716 +0000 UTC m=+0.022087125 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:03:06 compute-0 systemd[1]: Started libpod-conmon-2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d.scope.
Nov 22 10:03:06 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:03:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad5963f74a7478bdaf357daf4944f9c9788848581203ddc13f7413b3c82957ae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:03:06 compute-0 podman[213610]: 2025-11-22 10:03:06.262606777 +0000 UTC m=+0.189703186 container init 2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 10:03:06 compute-0 podman[213610]: 2025-11-22 10:03:06.272284751 +0000 UTC m=+0.199381130 container start 2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 10:03:06 compute-0 neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339[213626]: [NOTICE]   (213630) : New worker (213632) forked
Nov 22 10:03:06 compute-0 neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339[213626]: [NOTICE]   (213630) : Loading success.
Nov 22 10:03:08 compute-0 ovn_controller[95329]: 2025-11-22T10:03:08Z|00032|binding|INFO|Releasing lport c7b7efd9-65c4-4838-801c-c0373e9998a3 from this chassis (sb_readonly=0)
Nov 22 10:03:08 compute-0 NetworkManager[55425]: <info>  [1763805788.4533] manager: (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Nov 22 10:03:08 compute-0 NetworkManager[55425]: <info>  [1763805788.4538] device (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 10:03:08 compute-0 nova_compute[186981]: 2025-11-22 10:03:08.452 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:08 compute-0 NetworkManager[55425]: <info>  [1763805788.4550] manager: (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Nov 22 10:03:08 compute-0 NetworkManager[55425]: <info>  [1763805788.4553] device (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 10:03:08 compute-0 NetworkManager[55425]: <info>  [1763805788.4560] manager: (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Nov 22 10:03:08 compute-0 NetworkManager[55425]: <info>  [1763805788.4566] manager: (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Nov 22 10:03:08 compute-0 NetworkManager[55425]: <info>  [1763805788.4569] device (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 10:03:08 compute-0 NetworkManager[55425]: <info>  [1763805788.4573] device (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 10:03:08 compute-0 ovn_controller[95329]: 2025-11-22T10:03:08Z|00033|binding|INFO|Releasing lport c7b7efd9-65c4-4838-801c-c0373e9998a3 from this chassis (sb_readonly=0)
Nov 22 10:03:08 compute-0 nova_compute[186981]: 2025-11-22 10:03:08.479 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:08 compute-0 nova_compute[186981]: 2025-11-22 10:03:08.483 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:08 compute-0 nova_compute[186981]: 2025-11-22 10:03:08.897 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:09 compute-0 nova_compute[186981]: 2025-11-22 10:03:09.134 186985 DEBUG nova.compute.manager [req-58a3eb6b-06a1-4f2f-994c-ca7f42fc7e05 req-c5530272-4ad0-4e53-b977-276fead6e33d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Received event network-changed-6465f074-89d4-4e64-b119-166c8af9a08e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:03:09 compute-0 nova_compute[186981]: 2025-11-22 10:03:09.135 186985 DEBUG nova.compute.manager [req-58a3eb6b-06a1-4f2f-994c-ca7f42fc7e05 req-c5530272-4ad0-4e53-b977-276fead6e33d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Refreshing instance network info cache due to event network-changed-6465f074-89d4-4e64-b119-166c8af9a08e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:03:09 compute-0 nova_compute[186981]: 2025-11-22 10:03:09.135 186985 DEBUG oslo_concurrency.lockutils [req-58a3eb6b-06a1-4f2f-994c-ca7f42fc7e05 req-c5530272-4ad0-4e53-b977-276fead6e33d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:03:09 compute-0 nova_compute[186981]: 2025-11-22 10:03:09.136 186985 DEBUG oslo_concurrency.lockutils [req-58a3eb6b-06a1-4f2f-994c-ca7f42fc7e05 req-c5530272-4ad0-4e53-b977-276fead6e33d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:03:09 compute-0 nova_compute[186981]: 2025-11-22 10:03:09.136 186985 DEBUG nova.network.neutron [req-58a3eb6b-06a1-4f2f-994c-ca7f42fc7e05 req-c5530272-4ad0-4e53-b977-276fead6e33d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Refreshing network info cache for port 6465f074-89d4-4e64-b119-166c8af9a08e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:03:09 compute-0 nova_compute[186981]: 2025-11-22 10:03:09.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:03:09 compute-0 nova_compute[186981]: 2025-11-22 10:03:09.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:03:09 compute-0 nova_compute[186981]: 2025-11-22 10:03:09.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:03:09 compute-0 nova_compute[186981]: 2025-11-22 10:03:09.769 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:03:09 compute-0 nova_compute[186981]: 2025-11-22 10:03:09.814 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:10 compute-0 nova_compute[186981]: 2025-11-22 10:03:10.565 186985 DEBUG nova.network.neutron [req-58a3eb6b-06a1-4f2f-994c-ca7f42fc7e05 req-c5530272-4ad0-4e53-b977-276fead6e33d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updated VIF entry in instance network info cache for port 6465f074-89d4-4e64-b119-166c8af9a08e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:03:10 compute-0 nova_compute[186981]: 2025-11-22 10:03:10.566 186985 DEBUG nova.network.neutron [req-58a3eb6b-06a1-4f2f-994c-ca7f42fc7e05 req-c5530272-4ad0-4e53-b977-276fead6e33d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updating instance_info_cache with network_info: [{"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:03:10 compute-0 nova_compute[186981]: 2025-11-22 10:03:10.750 186985 DEBUG oslo_concurrency.lockutils [req-58a3eb6b-06a1-4f2f-994c-ca7f42fc7e05 req-c5530272-4ad0-4e53-b977-276fead6e33d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:03:10 compute-0 nova_compute[186981]: 2025-11-22 10:03:10.752 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquired lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:03:10 compute-0 nova_compute[186981]: 2025-11-22 10:03:10.752 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 10:03:10 compute-0 nova_compute[186981]: 2025-11-22 10:03:10.752 186985 DEBUG nova.objects.instance [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2c39aa0d-071b-45a1-9df3-aa0aadadf528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:03:13 compute-0 nova_compute[186981]: 2025-11-22 10:03:13.898 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.734 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updating instance_info_cache with network_info: [{"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.755 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Releasing lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.755 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.756 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.757 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.757 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.757 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.758 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.758 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.759 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.785 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.786 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.786 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.786 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.881 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.905 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.986 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:03:14 compute-0 nova_compute[186981]: 2025-11-22 10:03:14.989 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.057 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.218 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.219 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5577MB free_disk=73.46147537231445GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.220 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.220 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.299 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Instance 2c39aa0d-071b-45a1-9df3-aa0aadadf528 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.299 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.300 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.345 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating inventory in ProviderTree for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.376 186985 ERROR nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [req-d84307fc-14e7-4c0c-978a-9211efb0ba5b] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID dd02da68-d6c7-4f1a-8710-21abb7ad1703.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-d84307fc-14e7-4c0c-978a-9211efb0ba5b"}]}
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.394 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing inventories for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.417 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating ProviderTree inventory for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.418 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating inventory in ProviderTree for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.436 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing aggregate associations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.462 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing trait associations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703, traits: COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE2,HW_CPU_X86_F16C,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE4A,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.499 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating inventory in ProviderTree for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.542 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updated inventory for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.543 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.544 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating inventory in ProviderTree for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.570 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:03:15 compute-0 nova_compute[186981]: 2025-11-22 10:03:15.571 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:16 compute-0 ovn_controller[95329]: 2025-11-22T10:03:16Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f8:ae:57 10.100.0.7
Nov 22 10:03:16 compute-0 ovn_controller[95329]: 2025-11-22T10:03:16Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f8:ae:57 10.100.0.7
Nov 22 10:03:16 compute-0 nova_compute[186981]: 2025-11-22 10:03:16.409 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:03:16 compute-0 nova_compute[186981]: 2025-11-22 10:03:16.411 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:03:16 compute-0 nova_compute[186981]: 2025-11-22 10:03:16.445 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:03:16 compute-0 podman[213668]: 2025-11-22 10:03:16.630880485 +0000 UTC m=+0.081010455 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 10:03:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:17.930 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:17.931 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:17.932 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:18 compute-0 nova_compute[186981]: 2025-11-22 10:03:18.901 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:19 compute-0 nova_compute[186981]: 2025-11-22 10:03:19.879 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:21 compute-0 nova_compute[186981]: 2025-11-22 10:03:21.456 186985 INFO nova.compute.manager [None req-2d54d9d8-5448-4630-b0bb-1802b5ed33f3 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Get console output
Nov 22 10:03:21 compute-0 nova_compute[186981]: 2025-11-22 10:03:21.545 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:03:23 compute-0 nova_compute[186981]: 2025-11-22 10:03:23.902 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:24 compute-0 podman[213692]: 2025-11-22 10:03:24.617675145 +0000 UTC m=+0.074811226 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:03:24 compute-0 podman[213693]: 2025-11-22 10:03:24.674084616 +0000 UTC m=+0.120003561 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 10:03:24 compute-0 nova_compute[186981]: 2025-11-22 10:03:24.918 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:28 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:28.203 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:03:28 compute-0 nova_compute[186981]: 2025-11-22 10:03:28.204 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:28 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:28.205 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:03:28 compute-0 nova_compute[186981]: 2025-11-22 10:03:28.906 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:29 compute-0 nova_compute[186981]: 2025-11-22 10:03:29.921 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:30.208 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:03:30 compute-0 podman[213737]: 2025-11-22 10:03:30.61173792 +0000 UTC m=+0.062106130 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 22 10:03:30 compute-0 podman[213738]: 2025-11-22 10:03:30.627244873 +0000 UTC m=+0.077105478 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 10:03:33 compute-0 nova_compute[186981]: 2025-11-22 10:03:33.908 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:34 compute-0 podman[213778]: 2025-11-22 10:03:34.599083839 +0000 UTC m=+0.052345842 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:03:34 compute-0 podman[213777]: 2025-11-22 10:03:34.599232523 +0000 UTC m=+0.057506033 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 10:03:34 compute-0 nova_compute[186981]: 2025-11-22 10:03:34.923 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:35 compute-0 nova_compute[186981]: 2025-11-22 10:03:35.936 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:35 compute-0 nova_compute[186981]: 2025-11-22 10:03:35.936 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:35 compute-0 nova_compute[186981]: 2025-11-22 10:03:35.955 186985 DEBUG nova.compute.manager [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.064 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.065 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.078 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.080 186985 INFO nova.compute.claims [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.239 186985 DEBUG nova.compute.provider_tree [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.254 186985 DEBUG nova.scheduler.client.report [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.275 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.276 186985 DEBUG nova.compute.manager [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.329 186985 DEBUG nova.compute.manager [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.330 186985 DEBUG nova.network.neutron [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.355 186985 INFO nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.383 186985 DEBUG nova.compute.manager [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.489 186985 DEBUG nova.compute.manager [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.491 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.491 186985 INFO nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Creating image(s)
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.492 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.493 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.493 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.509 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.574 186985 DEBUG nova.policy [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.578 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.579 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.579 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.595 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.647 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.648 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.685 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.686 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.687 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.749 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.751 186985 DEBUG nova.virt.disk.api [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.751 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.813 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.814 186985 DEBUG nova.virt.disk.api [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.814 186985 DEBUG nova.objects.instance [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid f541d5d7-2ed0-40b4-a4bb-46f142461bc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.828 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.828 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Ensure instance console log exists: /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.829 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.829 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:36 compute-0 nova_compute[186981]: 2025-11-22 10:03:36.830 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:37 compute-0 nova_compute[186981]: 2025-11-22 10:03:37.130 186985 DEBUG nova.network.neutron [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Successfully created port: cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:03:37 compute-0 nova_compute[186981]: 2025-11-22 10:03:37.772 186985 DEBUG nova.network.neutron [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Successfully updated port: cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:03:37 compute-0 nova_compute[186981]: 2025-11-22 10:03:37.792 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-f541d5d7-2ed0-40b4-a4bb-46f142461bc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:03:37 compute-0 nova_compute[186981]: 2025-11-22 10:03:37.792 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-f541d5d7-2ed0-40b4-a4bb-46f142461bc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:03:37 compute-0 nova_compute[186981]: 2025-11-22 10:03:37.793 186985 DEBUG nova.network.neutron [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:03:37 compute-0 nova_compute[186981]: 2025-11-22 10:03:37.869 186985 DEBUG nova.compute.manager [req-a8888db3-3552-4907-bb31-3b88f1483d77 req-b531749b-5bea-4297-bd0b-1e7b7d2058b0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Received event network-changed-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:03:37 compute-0 nova_compute[186981]: 2025-11-22 10:03:37.870 186985 DEBUG nova.compute.manager [req-a8888db3-3552-4907-bb31-3b88f1483d77 req-b531749b-5bea-4297-bd0b-1e7b7d2058b0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Refreshing instance network info cache due to event network-changed-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:03:37 compute-0 nova_compute[186981]: 2025-11-22 10:03:37.870 186985 DEBUG oslo_concurrency.lockutils [req-a8888db3-3552-4907-bb31-3b88f1483d77 req-b531749b-5bea-4297-bd0b-1e7b7d2058b0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-f541d5d7-2ed0-40b4-a4bb-46f142461bc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:03:37 compute-0 nova_compute[186981]: 2025-11-22 10:03:37.914 186985 DEBUG nova.network.neutron [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:03:38 compute-0 nova_compute[186981]: 2025-11-22 10:03:38.909 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.363 186985 DEBUG nova.network.neutron [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Updating instance_info_cache with network_info: [{"id": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "address": "fa:16:3e:83:0d:0b", "network": {"id": "f8986924-41de-469e-8ec3-fd34abaf31ae", "bridge": "br-int", "label": "tempest-network-smoke--616547625", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebbfcc7-6c", "ovs_interfaceid": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.386 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-f541d5d7-2ed0-40b4-a4bb-46f142461bc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.387 186985 DEBUG nova.compute.manager [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Instance network_info: |[{"id": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "address": "fa:16:3e:83:0d:0b", "network": {"id": "f8986924-41de-469e-8ec3-fd34abaf31ae", "bridge": "br-int", "label": "tempest-network-smoke--616547625", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebbfcc7-6c", "ovs_interfaceid": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.387 186985 DEBUG oslo_concurrency.lockutils [req-a8888db3-3552-4907-bb31-3b88f1483d77 req-b531749b-5bea-4297-bd0b-1e7b7d2058b0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-f541d5d7-2ed0-40b4-a4bb-46f142461bc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.388 186985 DEBUG nova.network.neutron [req-a8888db3-3552-4907-bb31-3b88f1483d77 req-b531749b-5bea-4297-bd0b-1e7b7d2058b0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Refreshing network info cache for port cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.390 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Start _get_guest_xml network_info=[{"id": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "address": "fa:16:3e:83:0d:0b", "network": {"id": "f8986924-41de-469e-8ec3-fd34abaf31ae", "bridge": "br-int", "label": "tempest-network-smoke--616547625", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebbfcc7-6c", "ovs_interfaceid": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.395 186985 WARNING nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.403 186985 DEBUG nova.virt.libvirt.host [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.404 186985 DEBUG nova.virt.libvirt.host [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.408 186985 DEBUG nova.virt.libvirt.host [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.408 186985 DEBUG nova.virt.libvirt.host [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.408 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.409 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.409 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.409 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.409 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.410 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.410 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.410 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.410 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.410 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.411 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.411 186985 DEBUG nova.virt.hardware [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.414 186985 DEBUG nova.virt.libvirt.vif [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:03:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1453709900',display_name='tempest-TestNetworkBasicOps-server-1453709900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1453709900',id=2,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEBpINLp6gWT+QJpu95ARGbWlA0OGNpFBlrLO60K2ydGHt8Q0TTl/8LI1XM2IC5lLNaIsIbohDMDiG82tgKLOqsGP5NI5jPj3j/n1FBb6jdJeSvDGHewLls2NUd0V8Zctg==',key_name='tempest-TestNetworkBasicOps-1757194652',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-9jh4vdgq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:03:36Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=f541d5d7-2ed0-40b4-a4bb-46f142461bc2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "address": "fa:16:3e:83:0d:0b", "network": {"id": "f8986924-41de-469e-8ec3-fd34abaf31ae", "bridge": "br-int", "label": "tempest-network-smoke--616547625", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebbfcc7-6c", "ovs_interfaceid": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.414 186985 DEBUG nova.network.os_vif_util [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "address": "fa:16:3e:83:0d:0b", "network": {"id": "f8986924-41de-469e-8ec3-fd34abaf31ae", "bridge": "br-int", "label": "tempest-network-smoke--616547625", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebbfcc7-6c", "ovs_interfaceid": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.415 186985 DEBUG nova.network.os_vif_util [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:0d:0b,bridge_name='br-int',has_traffic_filtering=True,id=cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86,network=Network(f8986924-41de-469e-8ec3-fd34abaf31ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebbfcc7-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.416 186985 DEBUG nova.objects.instance [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid f541d5d7-2ed0-40b4-a4bb-46f142461bc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.433 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:03:39 compute-0 nova_compute[186981]:   <uuid>f541d5d7-2ed0-40b4-a4bb-46f142461bc2</uuid>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   <name>instance-00000002</name>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-1453709900</nova:name>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:03:39</nova:creationTime>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:03:39 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:03:39 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:03:39 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:03:39 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:03:39 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:03:39 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:03:39 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:03:39 compute-0 nova_compute[186981]:         <nova:port uuid="cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86">
Nov 22 10:03:39 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.23" ipVersion="4"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <system>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <entry name="serial">f541d5d7-2ed0-40b4-a4bb-46f142461bc2</entry>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <entry name="uuid">f541d5d7-2ed0-40b4-a4bb-46f142461bc2</entry>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     </system>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   <os>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   </os>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   <features>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   </features>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk.config"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:83:0d:0b"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <target dev="tapcebbfcc7-6c"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/console.log" append="off"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <video>
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     </video>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:03:39 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:03:39 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:03:39 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:03:39 compute-0 nova_compute[186981]: </domain>
Nov 22 10:03:39 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.434 186985 DEBUG nova.compute.manager [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Preparing to wait for external event network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.434 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.435 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.435 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.435 186985 DEBUG nova.virt.libvirt.vif [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:03:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1453709900',display_name='tempest-TestNetworkBasicOps-server-1453709900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1453709900',id=2,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEBpINLp6gWT+QJpu95ARGbWlA0OGNpFBlrLO60K2ydGHt8Q0TTl/8LI1XM2IC5lLNaIsIbohDMDiG82tgKLOqsGP5NI5jPj3j/n1FBb6jdJeSvDGHewLls2NUd0V8Zctg==',key_name='tempest-TestNetworkBasicOps-1757194652',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-9jh4vdgq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:03:36Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=f541d5d7-2ed0-40b4-a4bb-46f142461bc2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "address": "fa:16:3e:83:0d:0b", "network": {"id": "f8986924-41de-469e-8ec3-fd34abaf31ae", "bridge": "br-int", "label": "tempest-network-smoke--616547625", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebbfcc7-6c", "ovs_interfaceid": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.436 186985 DEBUG nova.network.os_vif_util [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "address": "fa:16:3e:83:0d:0b", "network": {"id": "f8986924-41de-469e-8ec3-fd34abaf31ae", "bridge": "br-int", "label": "tempest-network-smoke--616547625", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebbfcc7-6c", "ovs_interfaceid": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.436 186985 DEBUG nova.network.os_vif_util [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:0d:0b,bridge_name='br-int',has_traffic_filtering=True,id=cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86,network=Network(f8986924-41de-469e-8ec3-fd34abaf31ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebbfcc7-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.437 186985 DEBUG os_vif [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:0d:0b,bridge_name='br-int',has_traffic_filtering=True,id=cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86,network=Network(f8986924-41de-469e-8ec3-fd34abaf31ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebbfcc7-6c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.437 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.437 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.438 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.441 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.442 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcebbfcc7-6c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.442 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcebbfcc7-6c, col_values=(('external_ids', {'iface-id': 'cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:0d:0b', 'vm-uuid': 'f541d5d7-2ed0-40b4-a4bb-46f142461bc2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.486 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:39 compute-0 NetworkManager[55425]: <info>  [1763805819.4870] manager: (tapcebbfcc7-6c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.488 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.491 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.492 186985 INFO os_vif [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:0d:0b,bridge_name='br-int',has_traffic_filtering=True,id=cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86,network=Network(f8986924-41de-469e-8ec3-fd34abaf31ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebbfcc7-6c')
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.536 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.537 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.537 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:83:0d:0b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.538 186985 INFO nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Using config drive
Nov 22 10:03:39 compute-0 nova_compute[186981]: 2025-11-22 10:03:39.926 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:41 compute-0 nova_compute[186981]: 2025-11-22 10:03:41.593 186985 INFO nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Creating config drive at /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk.config
Nov 22 10:03:41 compute-0 nova_compute[186981]: 2025-11-22 10:03:41.603 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxg0945k8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:03:41 compute-0 nova_compute[186981]: 2025-11-22 10:03:41.744 186985 DEBUG oslo_concurrency.processutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxg0945k8" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:03:41 compute-0 kernel: tapcebbfcc7-6c: entered promiscuous mode
Nov 22 10:03:41 compute-0 NetworkManager[55425]: <info>  [1763805821.8229] manager: (tapcebbfcc7-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Nov 22 10:03:41 compute-0 ovn_controller[95329]: 2025-11-22T10:03:41Z|00034|binding|INFO|Claiming lport cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 for this chassis.
Nov 22 10:03:41 compute-0 ovn_controller[95329]: 2025-11-22T10:03:41Z|00035|binding|INFO|cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86: Claiming fa:16:3e:83:0d:0b 10.100.0.23
Nov 22 10:03:41 compute-0 nova_compute[186981]: 2025-11-22 10:03:41.868 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:41 compute-0 nova_compute[186981]: 2025-11-22 10:03:41.871 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:41 compute-0 systemd-udevd[213852]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:03:41 compute-0 NetworkManager[55425]: <info>  [1763805821.8961] device (tapcebbfcc7-6c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:03:41 compute-0 NetworkManager[55425]: <info>  [1763805821.8985] device (tapcebbfcc7-6c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:03:41 compute-0 ovn_controller[95329]: 2025-11-22T10:03:41Z|00036|binding|INFO|Setting lport cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 ovn-installed in OVS
Nov 22 10:03:41 compute-0 nova_compute[186981]: 2025-11-22 10:03:41.908 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:41 compute-0 systemd-machined[153303]: New machine qemu-2-instance-00000002.
Nov 22 10:03:41 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Nov 22 10:03:41 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:41.973 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:0d:0b 10.100.0.23'], port_security=['fa:16:3e:83:0d:0b 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': 'f541d5d7-2ed0-40b4-a4bb-46f142461bc2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8986924-41de-469e-8ec3-fd34abaf31ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f99b97dd-2d64-41a5-ba6a-cb6dba0da796', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee9704dc-ab05-42c6-a6ad-2a8284f87cb8, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:03:41 compute-0 ovn_controller[95329]: 2025-11-22T10:03:41Z|00037|binding|INFO|Setting lport cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 up in Southbound
Nov 22 10:03:41 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:41.974 104216 INFO neutron.agent.ovn.metadata.agent [-] Port cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 in datapath f8986924-41de-469e-8ec3-fd34abaf31ae bound to our chassis
Nov 22 10:03:41 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:41.976 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f8986924-41de-469e-8ec3-fd34abaf31ae
Nov 22 10:03:41 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:41.991 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[e72e4e6e-5b08-47f8-8aa5-9038f1d60d24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:41 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:41.993 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf8986924-41 in ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:03:41 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:41.995 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf8986924-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:03:41 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:41.995 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[2304f83b-fec7-4239-a7cd-51a3e0fa2746]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:41 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:41.996 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[859a58ea-ee63-4a27-8f32-bbd3035b6935]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.026 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[01404c8a-3e84-446a-87e5-0d5b43fbb62a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.046 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[fdefb74f-ae49-4393-b0eb-52c778762f1b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.087 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[bc6fe59b-fc28-41c1-982a-d39756cbb1af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 NetworkManager[55425]: <info>  [1763805822.0959] manager: (tapf8986924-40): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Nov 22 10:03:42 compute-0 systemd-udevd[213856]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.100 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[3a938580-2b23-4ad2-8d14-93a7894717eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.140 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[3065cea2-16b2-4fc8-ab03-cc643fee5cf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.143 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[a91af538-4039-4258-8ba4-c3f94f5d4d7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 NetworkManager[55425]: <info>  [1763805822.1736] device (tapf8986924-40): carrier: link connected
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.180 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[4b09440e-60b7-46ef-b102-8f892e297012]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.206 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[e4b804df-6d2e-4191-975f-581b21f43cd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8986924-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:55:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327988, 'reachable_time': 42319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213889, 'error': None, 'target': 'ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.228 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[2d3fedfa-8540-4b07-a357-edc228c1ab76]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:55db'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327988, 'tstamp': 327988}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213890, 'error': None, 'target': 'ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.252 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[5b2965fa-aeb2-4439-86e4-d75e32cde039]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8986924-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:55:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327988, 'reachable_time': 42319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213891, 'error': None, 'target': 'ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.288 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[dacbdc37-5c7d-44a3-aa2d-cef3b9714dd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.350 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[83c04f2c-06e5-46b0-82b7-29f0969a18d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.351 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8986924-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.352 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.352 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8986924-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:03:42 compute-0 NetworkManager[55425]: <info>  [1763805822.3542] manager: (tapf8986924-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Nov 22 10:03:42 compute-0 kernel: tapf8986924-40: entered promiscuous mode
Nov 22 10:03:42 compute-0 nova_compute[186981]: 2025-11-22 10:03:42.353 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.358 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf8986924-40, col_values=(('external_ids', {'iface-id': '4802329d-b08a-4be7-8737-08b31cb9ba40'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:03:42 compute-0 nova_compute[186981]: 2025-11-22 10:03:42.359 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:42 compute-0 ovn_controller[95329]: 2025-11-22T10:03:42Z|00038|binding|INFO|Releasing lport 4802329d-b08a-4be7-8737-08b31cb9ba40 from this chassis (sb_readonly=0)
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.361 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8986924-41de-469e-8ec3-fd34abaf31ae.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8986924-41de-469e-8ec3-fd34abaf31ae.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.362 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[6eba3591-23ef-4b5c-a09f-e50691f931c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.362 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-f8986924-41de-469e-8ec3-fd34abaf31ae
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/f8986924-41de-469e-8ec3-fd34abaf31ae.pid.haproxy
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID f8986924-41de-469e-8ec3-fd34abaf31ae
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:03:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:03:42.363 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae', 'env', 'PROCESS_TAG=haproxy-f8986924-41de-469e-8ec3-fd34abaf31ae', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f8986924-41de-469e-8ec3-fd34abaf31ae.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:03:42 compute-0 nova_compute[186981]: 2025-11-22 10:03:42.369 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:42 compute-0 nova_compute[186981]: 2025-11-22 10:03:42.538 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805822.5372028, f541d5d7-2ed0-40b4-a4bb-46f142461bc2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:03:42 compute-0 nova_compute[186981]: 2025-11-22 10:03:42.538 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] VM Started (Lifecycle Event)
Nov 22 10:03:42 compute-0 nova_compute[186981]: 2025-11-22 10:03:42.684 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:03:42 compute-0 nova_compute[186981]: 2025-11-22 10:03:42.691 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805822.5377579, f541d5d7-2ed0-40b4-a4bb-46f142461bc2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:03:42 compute-0 nova_compute[186981]: 2025-11-22 10:03:42.691 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] VM Paused (Lifecycle Event)
Nov 22 10:03:42 compute-0 podman[213931]: 2025-11-22 10:03:42.677158733 +0000 UTC m=+0.025069696 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:03:42 compute-0 podman[213931]: 2025-11-22 10:03:42.812660776 +0000 UTC m=+0.160571679 container create 7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 22 10:03:42 compute-0 nova_compute[186981]: 2025-11-22 10:03:42.878 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:03:42 compute-0 nova_compute[186981]: 2025-11-22 10:03:42.883 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:03:42 compute-0 systemd[1]: Started libpod-conmon-7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e.scope.
Nov 22 10:03:42 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:03:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abb96df06c9a8ca86643e648ff596331f4af37b5bc1c25156f06cf70c2f3d506/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:03:42 compute-0 podman[213931]: 2025-11-22 10:03:42.940986494 +0000 UTC m=+0.288897357 container init 7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 10:03:42 compute-0 podman[213931]: 2025-11-22 10:03:42.947622895 +0000 UTC m=+0.295533758 container start 7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 10:03:42 compute-0 neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae[213947]: [NOTICE]   (213951) : New worker (213953) forked
Nov 22 10:03:42 compute-0 neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae[213947]: [NOTICE]   (213951) : Loading success.
Nov 22 10:03:42 compute-0 nova_compute[186981]: 2025-11-22 10:03:42.974 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.721 186985 DEBUG nova.compute.manager [req-fcd89e23-e847-4d7a-8241-29834f21c76f req-aebafc97-0260-44b5-af3e-c962edd0bad1 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Received event network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.722 186985 DEBUG oslo_concurrency.lockutils [req-fcd89e23-e847-4d7a-8241-29834f21c76f req-aebafc97-0260-44b5-af3e-c962edd0bad1 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.722 186985 DEBUG oslo_concurrency.lockutils [req-fcd89e23-e847-4d7a-8241-29834f21c76f req-aebafc97-0260-44b5-af3e-c962edd0bad1 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.722 186985 DEBUG oslo_concurrency.lockutils [req-fcd89e23-e847-4d7a-8241-29834f21c76f req-aebafc97-0260-44b5-af3e-c962edd0bad1 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.722 186985 DEBUG nova.compute.manager [req-fcd89e23-e847-4d7a-8241-29834f21c76f req-aebafc97-0260-44b5-af3e-c962edd0bad1 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Processing event network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.723 186985 DEBUG nova.compute.manager [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.728 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805823.7278912, f541d5d7-2ed0-40b4-a4bb-46f142461bc2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.728 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] VM Resumed (Lifecycle Event)
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.730 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.734 186985 INFO nova.virt.libvirt.driver [-] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Instance spawned successfully.
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.734 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.927 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.934 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.967 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.968 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.969 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.969 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.969 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:43 compute-0 nova_compute[186981]: 2025-11-22 10:03:43.970 186985 DEBUG nova.virt.libvirt.driver [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:03:44 compute-0 nova_compute[186981]: 2025-11-22 10:03:44.006 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:03:44 compute-0 nova_compute[186981]: 2025-11-22 10:03:44.056 186985 INFO nova.compute.manager [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Took 7.57 seconds to spawn the instance on the hypervisor.
Nov 22 10:03:44 compute-0 nova_compute[186981]: 2025-11-22 10:03:44.057 186985 DEBUG nova.compute.manager [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:03:44 compute-0 nova_compute[186981]: 2025-11-22 10:03:44.231 186985 INFO nova.compute.manager [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Took 8.23 seconds to build instance.
Nov 22 10:03:44 compute-0 nova_compute[186981]: 2025-11-22 10:03:44.346 186985 DEBUG oslo_concurrency.lockutils [None req-e4b9e3d0-f239-4f56-a64f-b479a0d21e4d fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:44 compute-0 nova_compute[186981]: 2025-11-22 10:03:44.487 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:44 compute-0 nova_compute[186981]: 2025-11-22 10:03:44.960 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:45 compute-0 nova_compute[186981]: 2025-11-22 10:03:45.580 186985 DEBUG nova.network.neutron [req-a8888db3-3552-4907-bb31-3b88f1483d77 req-b531749b-5bea-4297-bd0b-1e7b7d2058b0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Updated VIF entry in instance network info cache for port cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:03:45 compute-0 nova_compute[186981]: 2025-11-22 10:03:45.581 186985 DEBUG nova.network.neutron [req-a8888db3-3552-4907-bb31-3b88f1483d77 req-b531749b-5bea-4297-bd0b-1e7b7d2058b0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Updating instance_info_cache with network_info: [{"id": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "address": "fa:16:3e:83:0d:0b", "network": {"id": "f8986924-41de-469e-8ec3-fd34abaf31ae", "bridge": "br-int", "label": "tempest-network-smoke--616547625", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebbfcc7-6c", "ovs_interfaceid": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:03:45 compute-0 nova_compute[186981]: 2025-11-22 10:03:45.618 186985 DEBUG oslo_concurrency.lockutils [req-a8888db3-3552-4907-bb31-3b88f1483d77 req-b531749b-5bea-4297-bd0b-1e7b7d2058b0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-f541d5d7-2ed0-40b4-a4bb-46f142461bc2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:03:46 compute-0 nova_compute[186981]: 2025-11-22 10:03:46.113 186985 DEBUG nova.compute.manager [req-3c92f9ed-3848-4e61-b2c9-1bffe43e780f req-d9855f0b-e164-4a8d-bf19-30f3832f4cbb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Received event network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:03:46 compute-0 nova_compute[186981]: 2025-11-22 10:03:46.114 186985 DEBUG oslo_concurrency.lockutils [req-3c92f9ed-3848-4e61-b2c9-1bffe43e780f req-d9855f0b-e164-4a8d-bf19-30f3832f4cbb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:03:46 compute-0 nova_compute[186981]: 2025-11-22 10:03:46.114 186985 DEBUG oslo_concurrency.lockutils [req-3c92f9ed-3848-4e61-b2c9-1bffe43e780f req-d9855f0b-e164-4a8d-bf19-30f3832f4cbb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:03:46 compute-0 nova_compute[186981]: 2025-11-22 10:03:46.114 186985 DEBUG oslo_concurrency.lockutils [req-3c92f9ed-3848-4e61-b2c9-1bffe43e780f req-d9855f0b-e164-4a8d-bf19-30f3832f4cbb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:03:46 compute-0 nova_compute[186981]: 2025-11-22 10:03:46.115 186985 DEBUG nova.compute.manager [req-3c92f9ed-3848-4e61-b2c9-1bffe43e780f req-d9855f0b-e164-4a8d-bf19-30f3832f4cbb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] No waiting events found dispatching network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:03:46 compute-0 nova_compute[186981]: 2025-11-22 10:03:46.115 186985 WARNING nova.compute.manager [req-3c92f9ed-3848-4e61-b2c9-1bffe43e780f req-d9855f0b-e164-4a8d-bf19-30f3832f4cbb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Received unexpected event network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 for instance with vm_state active and task_state None.
Nov 22 10:03:47 compute-0 podman[213962]: 2025-11-22 10:03:47.598683724 +0000 UTC m=+0.054888650 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 10:03:49 compute-0 nova_compute[186981]: 2025-11-22 10:03:49.489 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:49 compute-0 nova_compute[186981]: 2025-11-22 10:03:49.960 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:54 compute-0 nova_compute[186981]: 2025-11-22 10:03:54.492 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:54 compute-0 nova_compute[186981]: 2025-11-22 10:03:54.964 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:55 compute-0 podman[213987]: 2025-11-22 10:03:55.616021389 +0000 UTC m=+0.065434020 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 10:03:55 compute-0 podman[213988]: 2025-11-22 10:03:55.685373375 +0000 UTC m=+0.119696142 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:03:58 compute-0 ovn_controller[95329]: 2025-11-22T10:03:58Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:0d:0b 10.100.0.23
Nov 22 10:03:58 compute-0 ovn_controller[95329]: 2025-11-22T10:03:58Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:0d:0b 10.100.0.23
Nov 22 10:03:59 compute-0 nova_compute[186981]: 2025-11-22 10:03:59.508 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:03:59 compute-0 nova_compute[186981]: 2025-11-22 10:03:59.967 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:01 compute-0 podman[214049]: 2025-11-22 10:04:01.591927516 +0000 UTC m=+0.051183859 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 10:04:01 compute-0 podman[214050]: 2025-11-22 10:04:01.617209041 +0000 UTC m=+0.065804090 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git)
Nov 22 10:04:04 compute-0 nova_compute[186981]: 2025-11-22 10:04:04.511 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:04 compute-0 nova_compute[186981]: 2025-11-22 10:04:04.969 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:05 compute-0 podman[214091]: 2025-11-22 10:04:05.613049269 +0000 UTC m=+0.067029923 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Nov 22 10:04:05 compute-0 podman[214090]: 2025-11-22 10:04:05.620588787 +0000 UTC m=+0.071192079 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.085 186985 DEBUG oslo_concurrency.lockutils [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.086 186985 DEBUG oslo_concurrency.lockutils [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.087 186985 DEBUG oslo_concurrency.lockutils [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.087 186985 DEBUG oslo_concurrency.lockutils [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.087 186985 DEBUG oslo_concurrency.lockutils [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.089 186985 INFO nova.compute.manager [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Terminating instance
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.091 186985 DEBUG nova.compute.manager [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:04:07 compute-0 kernel: tapcebbfcc7-6c (unregistering): left promiscuous mode
Nov 22 10:04:07 compute-0 NetworkManager[55425]: <info>  [1763805847.1100] device (tapcebbfcc7-6c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.153 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 ovn_controller[95329]: 2025-11-22T10:04:07Z|00039|binding|INFO|Releasing lport cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 from this chassis (sb_readonly=0)
Nov 22 10:04:07 compute-0 ovn_controller[95329]: 2025-11-22T10:04:07Z|00040|binding|INFO|Setting lport cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 down in Southbound
Nov 22 10:04:07 compute-0 ovn_controller[95329]: 2025-11-22T10:04:07Z|00041|binding|INFO|Removing iface tapcebbfcc7-6c ovn-installed in OVS
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.156 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.179 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.186 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:0d:0b 10.100.0.23'], port_security=['fa:16:3e:83:0d:0b 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': 'f541d5d7-2ed0-40b4-a4bb-46f142461bc2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8986924-41de-469e-8ec3-fd34abaf31ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f99b97dd-2d64-41a5-ba6a-cb6dba0da796', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee9704dc-ab05-42c6-a6ad-2a8284f87cb8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.188 104216 INFO neutron.agent.ovn.metadata.agent [-] Port cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 in datapath f8986924-41de-469e-8ec3-fd34abaf31ae unbound from our chassis
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.189 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8986924-41de-469e-8ec3-fd34abaf31ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.190 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7fe24e37-b2f2-4ee1-8a43-05ada0cb1a1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.191 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae namespace which is not needed anymore
Nov 22 10:04:07 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 22 10:04:07 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 14.291s CPU time.
Nov 22 10:04:07 compute-0 systemd-machined[153303]: Machine qemu-2-instance-00000002 terminated.
Nov 22 10:04:07 compute-0 kernel: tapcebbfcc7-6c: entered promiscuous mode
Nov 22 10:04:07 compute-0 ovn_controller[95329]: 2025-11-22T10:04:07Z|00042|binding|INFO|Claiming lport cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 for this chassis.
Nov 22 10:04:07 compute-0 ovn_controller[95329]: 2025-11-22T10:04:07Z|00043|binding|INFO|cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86: Claiming fa:16:3e:83:0d:0b 10.100.0.23
Nov 22 10:04:07 compute-0 NetworkManager[55425]: <info>  [1763805847.3202] manager: (tapcebbfcc7-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Nov 22 10:04:07 compute-0 systemd-udevd[214137]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.324 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 kernel: tapcebbfcc7-6c (unregistering): left promiscuous mode
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.334 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:0d:0b 10.100.0.23'], port_security=['fa:16:3e:83:0d:0b 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': 'f541d5d7-2ed0-40b4-a4bb-46f142461bc2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8986924-41de-469e-8ec3-fd34abaf31ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f99b97dd-2d64-41a5-ba6a-cb6dba0da796', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee9704dc-ab05-42c6-a6ad-2a8284f87cb8, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:04:07 compute-0 ovn_controller[95329]: 2025-11-22T10:04:07Z|00044|binding|INFO|Setting lport cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 ovn-installed in OVS
Nov 22 10:04:07 compute-0 ovn_controller[95329]: 2025-11-22T10:04:07Z|00045|binding|INFO|Setting lport cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 up in Southbound
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.342 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 ovn_controller[95329]: 2025-11-22T10:04:07Z|00046|binding|INFO|Releasing lport cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 from this chassis (sb_readonly=0)
Nov 22 10:04:07 compute-0 ovn_controller[95329]: 2025-11-22T10:04:07Z|00047|binding|INFO|Setting lport cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 down in Southbound
Nov 22 10:04:07 compute-0 ovn_controller[95329]: 2025-11-22T10:04:07Z|00048|binding|INFO|Removing iface tapcebbfcc7-6c ovn-installed in OVS
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.348 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.357 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:0d:0b 10.100.0.23'], port_security=['fa:16:3e:83:0d:0b 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': 'f541d5d7-2ed0-40b4-a4bb-46f142461bc2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8986924-41de-469e-8ec3-fd34abaf31ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f99b97dd-2d64-41a5-ba6a-cb6dba0da796', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee9704dc-ab05-42c6-a6ad-2a8284f87cb8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.365 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae[213947]: [NOTICE]   (213951) : haproxy version is 2.8.14-c23fe91
Nov 22 10:04:07 compute-0 neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae[213947]: [NOTICE]   (213951) : path to executable is /usr/sbin/haproxy
Nov 22 10:04:07 compute-0 neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae[213947]: [WARNING]  (213951) : Exiting Master process...
Nov 22 10:04:07 compute-0 neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae[213947]: [ALERT]    (213951) : Current worker (213953) exited with code 143 (Terminated)
Nov 22 10:04:07 compute-0 neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae[213947]: [WARNING]  (213951) : All workers exited. Exiting... (0)
Nov 22 10:04:07 compute-0 systemd[1]: libpod-7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e.scope: Deactivated successfully.
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.388 186985 INFO nova.virt.libvirt.driver [-] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Instance destroyed successfully.
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.389 186985 DEBUG nova.objects.instance [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid f541d5d7-2ed0-40b4-a4bb-46f142461bc2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:04:07 compute-0 podman[214157]: 2025-11-22 10:04:07.390552618 +0000 UTC m=+0.107036634 container died 7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.409 186985 DEBUG nova.virt.libvirt.vif [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:03:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1453709900',display_name='tempest-TestNetworkBasicOps-server-1453709900',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1453709900',id=2,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEBpINLp6gWT+QJpu95ARGbWlA0OGNpFBlrLO60K2ydGHt8Q0TTl/8LI1XM2IC5lLNaIsIbohDMDiG82tgKLOqsGP5NI5jPj3j/n1FBb6jdJeSvDGHewLls2NUd0V8Zctg==',key_name='tempest-TestNetworkBasicOps-1757194652',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:03:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-9jh4vdgq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:03:44Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=f541d5d7-2ed0-40b4-a4bb-46f142461bc2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "address": "fa:16:3e:83:0d:0b", "network": {"id": "f8986924-41de-469e-8ec3-fd34abaf31ae", "bridge": "br-int", "label": "tempest-network-smoke--616547625", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebbfcc7-6c", "ovs_interfaceid": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.410 186985 DEBUG nova.network.os_vif_util [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "address": "fa:16:3e:83:0d:0b", "network": {"id": "f8986924-41de-469e-8ec3-fd34abaf31ae", "bridge": "br-int", "label": "tempest-network-smoke--616547625", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebbfcc7-6c", "ovs_interfaceid": "cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.411 186985 DEBUG nova.network.os_vif_util [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:0d:0b,bridge_name='br-int',has_traffic_filtering=True,id=cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86,network=Network(f8986924-41de-469e-8ec3-fd34abaf31ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebbfcc7-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.411 186985 DEBUG os_vif [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:0d:0b,bridge_name='br-int',has_traffic_filtering=True,id=cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86,network=Network(f8986924-41de-469e-8ec3-fd34abaf31ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebbfcc7-6c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.415 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.415 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcebbfcc7-6c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.417 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.418 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.422 186985 INFO os_vif [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:0d:0b,bridge_name='br-int',has_traffic_filtering=True,id=cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86,network=Network(f8986924-41de-469e-8ec3-fd34abaf31ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebbfcc7-6c')
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.423 186985 INFO nova.virt.libvirt.driver [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Deleting instance files /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2_del
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.423 186985 INFO nova.virt.libvirt.driver [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Deletion of /var/lib/nova/instances/f541d5d7-2ed0-40b4-a4bb-46f142461bc2_del complete
Nov 22 10:04:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e-userdata-shm.mount: Deactivated successfully.
Nov 22 10:04:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-abb96df06c9a8ca86643e648ff596331f4af37b5bc1c25156f06cf70c2f3d506-merged.mount: Deactivated successfully.
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.496 186985 DEBUG nova.virt.libvirt.host [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.497 186985 INFO nova.virt.libvirt.host [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] UEFI support detected
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.498 186985 INFO nova.compute.manager [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Took 0.41 seconds to destroy the instance on the hypervisor.
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.499 186985 DEBUG oslo.service.loopingcall [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.499 186985 DEBUG nova.compute.manager [-] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.499 186985 DEBUG nova.network.neutron [-] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:04:07 compute-0 podman[214157]: 2025-11-22 10:04:07.502477385 +0000 UTC m=+0.218961391 container cleanup 7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:04:07 compute-0 systemd[1]: libpod-conmon-7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e.scope: Deactivated successfully.
Nov 22 10:04:07 compute-0 podman[214198]: 2025-11-22 10:04:07.574354252 +0000 UTC m=+0.047092606 container remove 7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.580 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[11952203-9c78-404f-8e78-9d7d847a7039]: (4, ('Sat Nov 22 10:04:07 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae (7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e)\n7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e\nSat Nov 22 10:04:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae (7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e)\n7c5830630c80c8d19b294dbaece23459a24f854d004d4512d43182b2c0a8453e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.582 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[f0437ead-a603-4846-9032-7343f4e3175c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.583 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8986924-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.584 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 kernel: tapf8986924-40: left promiscuous mode
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.586 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.589 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[6966eda2-6765-42c7-90f2-591b2551b492]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.596 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.611 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[980c2aac-fa4c-4565-9014-1be3f90253e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.612 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[857fa975-22e9-4f1a-bf83-65646f96863d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.632 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[c7ca2eb7-8ce1-445c-84f1-bd3546323d06]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327978, 'reachable_time': 21390, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214213, 'error': None, 'target': 'ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:07 compute-0 systemd[1]: run-netns-ovnmeta\x2df8986924\x2d41de\x2d469e\x2d8ec3\x2dfd34abaf31ae.mount: Deactivated successfully.
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.641 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f8986924-41de-469e-8ec3-fd34abaf31ae deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.642 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[33418ece-6f91-4c11-b773-aa31921c6ad8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.644 104216 INFO neutron.agent.ovn.metadata.agent [-] Port cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 in datapath f8986924-41de-469e-8ec3-fd34abaf31ae unbound from our chassis
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.645 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8986924-41de-469e-8ec3-fd34abaf31ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.646 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[2ecfafee-601b-46bb-9069-58ff420cc9e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.647 104216 INFO neutron.agent.ovn.metadata.agent [-] Port cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 in datapath f8986924-41de-469e-8ec3-fd34abaf31ae unbound from our chassis
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.648 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8986924-41de-469e-8ec3-fd34abaf31ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:04:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:07.649 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7d7f57ff-f878-4362-baee-f6c8abdd2260]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.731 186985 DEBUG nova.compute.manager [req-099afa2e-b415-432f-94f3-69478985f665 req-b08441c0-3c81-4734-9d94-fc98079db0f6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Received event network-vif-unplugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.731 186985 DEBUG oslo_concurrency.lockutils [req-099afa2e-b415-432f-94f3-69478985f665 req-b08441c0-3c81-4734-9d94-fc98079db0f6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.731 186985 DEBUG oslo_concurrency.lockutils [req-099afa2e-b415-432f-94f3-69478985f665 req-b08441c0-3c81-4734-9d94-fc98079db0f6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.732 186985 DEBUG oslo_concurrency.lockutils [req-099afa2e-b415-432f-94f3-69478985f665 req-b08441c0-3c81-4734-9d94-fc98079db0f6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.732 186985 DEBUG nova.compute.manager [req-099afa2e-b415-432f-94f3-69478985f665 req-b08441c0-3c81-4734-9d94-fc98079db0f6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] No waiting events found dispatching network-vif-unplugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:04:07 compute-0 nova_compute[186981]: 2025-11-22 10:04:07.732 186985 DEBUG nova.compute.manager [req-099afa2e-b415-432f-94f3-69478985f665 req-b08441c0-3c81-4734-9d94-fc98079db0f6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Received event network-vif-unplugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 10:04:08 compute-0 nova_compute[186981]: 2025-11-22 10:04:08.542 186985 DEBUG nova.network.neutron [-] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:04:08 compute-0 nova_compute[186981]: 2025-11-22 10:04:08.574 186985 INFO nova.compute.manager [-] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Took 1.08 seconds to deallocate network for instance.
Nov 22 10:04:08 compute-0 nova_compute[186981]: 2025-11-22 10:04:08.677 186985 DEBUG oslo_concurrency.lockutils [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:08 compute-0 nova_compute[186981]: 2025-11-22 10:04:08.678 186985 DEBUG oslo_concurrency.lockutils [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:08 compute-0 nova_compute[186981]: 2025-11-22 10:04:08.752 186985 DEBUG nova.compute.provider_tree [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:04:08 compute-0 nova_compute[186981]: 2025-11-22 10:04:08.806 186985 DEBUG nova.scheduler.client.report [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:04:08 compute-0 nova_compute[186981]: 2025-11-22 10:04:08.851 186985 DEBUG oslo_concurrency.lockutils [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:08 compute-0 nova_compute[186981]: 2025-11-22 10:04:08.921 186985 INFO nova.scheduler.client.report [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance f541d5d7-2ed0-40b4-a4bb-46f142461bc2
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.009 186985 DEBUG oslo_concurrency.lockutils [None req-858d6643-7800-4a3d-bd9a-f9ef530da8c0 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.923s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.823 186985 DEBUG nova.compute.manager [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Received event network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.823 186985 DEBUG oslo_concurrency.lockutils [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.824 186985 DEBUG oslo_concurrency.lockutils [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.824 186985 DEBUG oslo_concurrency.lockutils [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.824 186985 DEBUG nova.compute.manager [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] No waiting events found dispatching network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.824 186985 WARNING nova.compute.manager [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Received unexpected event network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 for instance with vm_state deleted and task_state None.
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.824 186985 DEBUG nova.compute.manager [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Received event network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.825 186985 DEBUG oslo_concurrency.lockutils [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.825 186985 DEBUG oslo_concurrency.lockutils [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.825 186985 DEBUG oslo_concurrency.lockutils [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "f541d5d7-2ed0-40b4-a4bb-46f142461bc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.825 186985 DEBUG nova.compute.manager [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] No waiting events found dispatching network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.825 186985 WARNING nova.compute.manager [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Received unexpected event network-vif-plugged-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 for instance with vm_state deleted and task_state None.
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.826 186985 DEBUG nova.compute.manager [req-3d52fc79-875c-4603-85a5-a70dc9a3afb0 req-0d4206cd-0654-48d1-9151-14c70ddab77e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Received event network-vif-deleted-cebbfcc7-6c98-4c36-91f4-65cbbf6e1f86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:09 compute-0 nova_compute[186981]: 2025-11-22 10:04:09.972 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:10 compute-0 nova_compute[186981]: 2025-11-22 10:04:10.557 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:04:10 compute-0 nova_compute[186981]: 2025-11-22 10:04:10.557 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquired lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:04:10 compute-0 nova_compute[186981]: 2025-11-22 10:04:10.557 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 10:04:10 compute-0 nova_compute[186981]: 2025-11-22 10:04:10.558 186985 DEBUG nova.objects.instance [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2c39aa0d-071b-45a1-9df3-aa0aadadf528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:04:12 compute-0 nova_compute[186981]: 2025-11-22 10:04:12.421 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:12 compute-0 ovn_controller[95329]: 2025-11-22T10:04:12Z|00049|binding|INFO|Releasing lport c7b7efd9-65c4-4838-801c-c0373e9998a3 from this chassis (sb_readonly=0)
Nov 22 10:04:12 compute-0 nova_compute[186981]: 2025-11-22 10:04:12.954 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.786 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updating instance_info_cache with network_info: [{"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.805 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Releasing lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.806 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.807 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.807 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.808 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.808 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.809 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.810 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.829 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.830 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.830 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.831 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:04:13 compute-0 nova_compute[186981]: 2025-11-22 10:04:13.913 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.017 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.018 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.091 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.285 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.287 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5633MB free_disk=73.4343490600586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.287 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.287 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.387 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Instance 2c39aa0d-071b-45a1-9df3-aa0aadadf528 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.387 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.388 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.423 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.440 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.468 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:04:14 compute-0 nova_compute[186981]: 2025-11-22 10:04:14.469 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.025 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.036 186985 DEBUG nova.compute.manager [req-1608d738-c333-4dfb-b8be-764a550eb711 req-44e6fc6b-8253-4db3-93f5-0d75066932e9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Received event network-changed-6465f074-89d4-4e64-b119-166c8af9a08e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.037 186985 DEBUG nova.compute.manager [req-1608d738-c333-4dfb-b8be-764a550eb711 req-44e6fc6b-8253-4db3-93f5-0d75066932e9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Refreshing instance network info cache due to event network-changed-6465f074-89d4-4e64-b119-166c8af9a08e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.037 186985 DEBUG oslo_concurrency.lockutils [req-1608d738-c333-4dfb-b8be-764a550eb711 req-44e6fc6b-8253-4db3-93f5-0d75066932e9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.037 186985 DEBUG oslo_concurrency.lockutils [req-1608d738-c333-4dfb-b8be-764a550eb711 req-44e6fc6b-8253-4db3-93f5-0d75066932e9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.037 186985 DEBUG nova.network.neutron [req-1608d738-c333-4dfb-b8be-764a550eb711 req-44e6fc6b-8253-4db3-93f5-0d75066932e9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Refreshing network info cache for port 6465f074-89d4-4e64-b119-166c8af9a08e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.254 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.255 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.296 186985 DEBUG oslo_concurrency.lockutils [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.296 186985 DEBUG oslo_concurrency.lockutils [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.297 186985 DEBUG oslo_concurrency.lockutils [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.297 186985 DEBUG oslo_concurrency.lockutils [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.297 186985 DEBUG oslo_concurrency.lockutils [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.298 186985 INFO nova.compute.manager [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Terminating instance
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.299 186985 DEBUG nova.compute.manager [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:04:15 compute-0 kernel: tap6465f074-89 (unregistering): left promiscuous mode
Nov 22 10:04:15 compute-0 NetworkManager[55425]: <info>  [1763805855.3683] device (tap6465f074-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:04:15 compute-0 ovn_controller[95329]: 2025-11-22T10:04:15Z|00050|binding|INFO|Releasing lport 6465f074-89d4-4e64-b119-166c8af9a08e from this chassis (sb_readonly=0)
Nov 22 10:04:15 compute-0 ovn_controller[95329]: 2025-11-22T10:04:15Z|00051|binding|INFO|Setting lport 6465f074-89d4-4e64-b119-166c8af9a08e down in Southbound
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.370 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:15 compute-0 ovn_controller[95329]: 2025-11-22T10:04:15Z|00052|binding|INFO|Removing iface tap6465f074-89 ovn-installed in OVS
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.372 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:15.377 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ae:57 10.100.0.7'], port_security=['fa:16:3e:f8:ae:57 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2c39aa0d-071b-45a1-9df3-aa0aadadf528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e9e0707-a3e1-46b9-90a3-9a4c8f606339', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b79e302f-0fd7-4a4d-968d-0a04ab688694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab26dd7a-ff98-4f9a-83f2-1180a5ec2aae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=6465f074-89d4-4e64-b119-166c8af9a08e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:04:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:15.378 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 6465f074-89d4-4e64-b119-166c8af9a08e in datapath 8e9e0707-a3e1-46b9-90a3-9a4c8f606339 unbound from our chassis
Nov 22 10:04:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:15.378 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e9e0707-a3e1-46b9-90a3-9a4c8f606339, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:04:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:15.379 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[945dc245-4b2e-4f2c-a5cf-9495d7ccb247]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:15.379 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339 namespace which is not needed anymore
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.392 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:15 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Nov 22 10:04:15 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 16.088s CPU time.
Nov 22 10:04:15 compute-0 systemd-machined[153303]: Machine qemu-1-instance-00000001 terminated.
Nov 22 10:04:15 compute-0 neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339[213626]: [NOTICE]   (213630) : haproxy version is 2.8.14-c23fe91
Nov 22 10:04:15 compute-0 neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339[213626]: [NOTICE]   (213630) : path to executable is /usr/sbin/haproxy
Nov 22 10:04:15 compute-0 neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339[213626]: [WARNING]  (213630) : Exiting Master process...
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.563 186985 INFO nova.virt.libvirt.driver [-] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Instance destroyed successfully.
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.563 186985 DEBUG nova.objects.instance [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid 2c39aa0d-071b-45a1-9df3-aa0aadadf528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:04:15 compute-0 neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339[213626]: [ALERT]    (213630) : Current worker (213632) exited with code 143 (Terminated)
Nov 22 10:04:15 compute-0 neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339[213626]: [WARNING]  (213630) : All workers exited. Exiting... (0)
Nov 22 10:04:15 compute-0 systemd[1]: libpod-2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d.scope: Deactivated successfully.
Nov 22 10:04:15 compute-0 podman[214248]: 2025-11-22 10:04:15.575546249 +0000 UTC m=+0.102902641 container died 2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.576 186985 DEBUG nova.virt.libvirt.vif [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-274452189',display_name='tempest-TestNetworkBasicOps-server-274452189',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-274452189',id=1,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPqQNcA3mkBmrzfDPkqcaEj4rLol6lzA6hJX1xi7/BH7qavf/IxLCw6ctGHitrvd0GkDOkwEd5ZIr0N0xpJmzktofgCqz8QqwgTcYI7MEXIdckVheN6EhshMzlhT2jz8Ww==',key_name='tempest-TestNetworkBasicOps-971114335',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:03:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ps8oimsf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:03:04Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=2c39aa0d-071b-45a1-9df3-aa0aadadf528,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.577 186985 DEBUG nova.network.os_vif_util [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.577 186985 DEBUG nova.network.os_vif_util [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f8:ae:57,bridge_name='br-int',has_traffic_filtering=True,id=6465f074-89d4-4e64-b119-166c8af9a08e,network=Network(8e9e0707-a3e1-46b9-90a3-9a4c8f606339),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6465f074-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.578 186985 DEBUG os_vif [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:ae:57,bridge_name='br-int',has_traffic_filtering=True,id=6465f074-89d4-4e64-b119-166c8af9a08e,network=Network(8e9e0707-a3e1-46b9-90a3-9a4c8f606339),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6465f074-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.579 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.580 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6465f074-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.581 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.582 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.585 186985 INFO os_vif [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:ae:57,bridge_name='br-int',has_traffic_filtering=True,id=6465f074-89d4-4e64-b119-166c8af9a08e,network=Network(8e9e0707-a3e1-46b9-90a3-9a4c8f606339),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6465f074-89')
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.586 186985 INFO nova.virt.libvirt.driver [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Deleting instance files /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528_del
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.586 186985 INFO nova.virt.libvirt.driver [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Deletion of /var/lib/nova/instances/2c39aa0d-071b-45a1-9df3-aa0aadadf528_del complete
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.628 186985 DEBUG nova.compute.manager [req-4728acb8-1eab-45eb-83d4-abde77b9bbe4 req-2270edc1-4726-43c5-b27c-ac6188f54d48 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Received event network-vif-unplugged-6465f074-89d4-4e64-b119-166c8af9a08e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.628 186985 DEBUG oslo_concurrency.lockutils [req-4728acb8-1eab-45eb-83d4-abde77b9bbe4 req-2270edc1-4726-43c5-b27c-ac6188f54d48 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.628 186985 DEBUG oslo_concurrency.lockutils [req-4728acb8-1eab-45eb-83d4-abde77b9bbe4 req-2270edc1-4726-43c5-b27c-ac6188f54d48 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.628 186985 DEBUG oslo_concurrency.lockutils [req-4728acb8-1eab-45eb-83d4-abde77b9bbe4 req-2270edc1-4726-43c5-b27c-ac6188f54d48 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.629 186985 DEBUG nova.compute.manager [req-4728acb8-1eab-45eb-83d4-abde77b9bbe4 req-2270edc1-4726-43c5-b27c-ac6188f54d48 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] No waiting events found dispatching network-vif-unplugged-6465f074-89d4-4e64-b119-166c8af9a08e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.629 186985 DEBUG nova.compute.manager [req-4728acb8-1eab-45eb-83d4-abde77b9bbe4 req-2270edc1-4726-43c5-b27c-ac6188f54d48 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Received event network-vif-unplugged-6465f074-89d4-4e64-b119-166c8af9a08e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.638 186985 INFO nova.compute.manager [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Took 0.34 seconds to destroy the instance on the hypervisor.
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.638 186985 DEBUG oslo.service.loopingcall [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.639 186985 DEBUG nova.compute.manager [-] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:04:15 compute-0 nova_compute[186981]: 2025-11-22 10:04:15.639 186985 DEBUG nova.network.neutron [-] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:04:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d-userdata-shm.mount: Deactivated successfully.
Nov 22 10:04:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-ad5963f74a7478bdaf357daf4944f9c9788848581203ddc13f7413b3c82957ae-merged.mount: Deactivated successfully.
Nov 22 10:04:16 compute-0 podman[214248]: 2025-11-22 10:04:16.008974065 +0000 UTC m=+0.536330447 container cleanup 2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:04:16 compute-0 systemd[1]: libpod-conmon-2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d.scope: Deactivated successfully.
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.603 186985 DEBUG nova.network.neutron [req-1608d738-c333-4dfb-b8be-764a550eb711 req-44e6fc6b-8253-4db3-93f5-0d75066932e9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updated VIF entry in instance network info cache for port 6465f074-89d4-4e64-b119-166c8af9a08e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.603 186985 DEBUG nova.network.neutron [req-1608d738-c333-4dfb-b8be-764a550eb711 req-44e6fc6b-8253-4db3-93f5-0d75066932e9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updating instance_info_cache with network_info: [{"id": "6465f074-89d4-4e64-b119-166c8af9a08e", "address": "fa:16:3e:f8:ae:57", "network": {"id": "8e9e0707-a3e1-46b9-90a3-9a4c8f606339", "bridge": "br-int", "label": "tempest-network-smoke--692798732", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6465f074-89", "ovs_interfaceid": "6465f074-89d4-4e64-b119-166c8af9a08e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.609 186985 DEBUG nova.network.neutron [-] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.627 186985 INFO nova.compute.manager [-] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Took 0.99 seconds to deallocate network for instance.
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.634 186985 DEBUG oslo_concurrency.lockutils [req-1608d738-c333-4dfb-b8be-764a550eb711 req-44e6fc6b-8253-4db3-93f5-0d75066932e9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-2c39aa0d-071b-45a1-9df3-aa0aadadf528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.690 186985 DEBUG oslo_concurrency.lockutils [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.690 186985 DEBUG oslo_concurrency.lockutils [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.734 186985 DEBUG nova.compute.provider_tree [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.760 186985 DEBUG nova.scheduler.client.report [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.783 186985 DEBUG oslo_concurrency.lockutils [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.819 186985 INFO nova.scheduler.client.report [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance 2c39aa0d-071b-45a1-9df3-aa0aadadf528
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:04:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.914 186985 DEBUG oslo_concurrency.lockutils [None req-17947ea5-71eb-4cd0-a50c-fa031bccc06a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:16 compute-0 podman[214293]: 2025-11-22 10:04:16.950233043 +0000 UTC m=+0.917419114 container remove 2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 10:04:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:16.958 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[9dd30f9e-ac48-47db-ac31-0686eb499118]: (4, ('Sat Nov 22 10:04:15 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339 (2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d)\n2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d\nSat Nov 22 10:04:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339 (2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d)\n2b0b5fb1a69d21297601f0f7bbc40a5ae6d746d2de1e4e0398412afeeb9be55d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:16.960 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6aa204-1f2a-441e-ba2d-d448e35a1731]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:16.960 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e9e0707-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:04:16 compute-0 kernel: tap8e9e0707-a0: left promiscuous mode
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.962 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:16.967 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[d70c42bb-a9e1-45b9-978c-8a14c57d9375]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:16 compute-0 nova_compute[186981]: 2025-11-22 10:04:16.978 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:16.986 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[57e1e24b-3e25-4c7b-aade-ff8d7a7657ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:16.987 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[269d3c9d-2d62-485f-b05e-5a4f9a824b03]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:17.002 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[633eef2c-dd80-49ac-8433-fa4461de5f3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 324319, 'reachable_time': 38516, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214312, 'error': None, 'target': 'ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:17 compute-0 systemd[1]: run-netns-ovnmeta\x2d8e9e0707\x2da3e1\x2d46b9\x2d90a3\x2d9a4c8f606339.mount: Deactivated successfully.
Nov 22 10:04:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:17.006 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e9e0707-a3e1-46b9-90a3-9a4c8f606339 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:04:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:17.006 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe81d72-3571-4b0b-ad69-af45c8725781]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:17 compute-0 nova_compute[186981]: 2025-11-22 10:04:17.107 186985 DEBUG nova.compute.manager [req-a83b3b7c-9230-42f9-a85a-616e24ed5266 req-0af07ddb-1fd0-4c89-a040-1834b0cc44b3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Received event network-vif-deleted-6465f074-89d4-4e64-b119-166c8af9a08e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:17 compute-0 nova_compute[186981]: 2025-11-22 10:04:17.107 186985 INFO nova.compute.manager [req-a83b3b7c-9230-42f9-a85a-616e24ed5266 req-0af07ddb-1fd0-4c89-a040-1834b0cc44b3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Neutron deleted interface 6465f074-89d4-4e64-b119-166c8af9a08e; detaching it from the instance and deleting it from the info cache
Nov 22 10:04:17 compute-0 nova_compute[186981]: 2025-11-22 10:04:17.108 186985 DEBUG nova.network.neutron [req-a83b3b7c-9230-42f9-a85a-616e24ed5266 req-0af07ddb-1fd0-4c89-a040-1834b0cc44b3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Nov 22 10:04:17 compute-0 nova_compute[186981]: 2025-11-22 10:04:17.111 186985 DEBUG nova.compute.manager [req-a83b3b7c-9230-42f9-a85a-616e24ed5266 req-0af07ddb-1fd0-4c89-a040-1834b0cc44b3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Detach interface failed, port_id=6465f074-89d4-4e64-b119-166c8af9a08e, reason: Instance 2c39aa0d-071b-45a1-9df3-aa0aadadf528 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 10:04:17 compute-0 nova_compute[186981]: 2025-11-22 10:04:17.760 186985 DEBUG nova.compute.manager [req-cb7d6904-9366-48b4-aae6-d3075b55a32e req-47ffea33-2e1b-4c85-9f56-f1dd9db2ff85 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Received event network-vif-plugged-6465f074-89d4-4e64-b119-166c8af9a08e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:17 compute-0 nova_compute[186981]: 2025-11-22 10:04:17.760 186985 DEBUG oslo_concurrency.lockutils [req-cb7d6904-9366-48b4-aae6-d3075b55a32e req-47ffea33-2e1b-4c85-9f56-f1dd9db2ff85 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:17 compute-0 nova_compute[186981]: 2025-11-22 10:04:17.761 186985 DEBUG oslo_concurrency.lockutils [req-cb7d6904-9366-48b4-aae6-d3075b55a32e req-47ffea33-2e1b-4c85-9f56-f1dd9db2ff85 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:17 compute-0 nova_compute[186981]: 2025-11-22 10:04:17.761 186985 DEBUG oslo_concurrency.lockutils [req-cb7d6904-9366-48b4-aae6-d3075b55a32e req-47ffea33-2e1b-4c85-9f56-f1dd9db2ff85 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "2c39aa0d-071b-45a1-9df3-aa0aadadf528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:17 compute-0 nova_compute[186981]: 2025-11-22 10:04:17.761 186985 DEBUG nova.compute.manager [req-cb7d6904-9366-48b4-aae6-d3075b55a32e req-47ffea33-2e1b-4c85-9f56-f1dd9db2ff85 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] No waiting events found dispatching network-vif-plugged-6465f074-89d4-4e64-b119-166c8af9a08e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:04:17 compute-0 nova_compute[186981]: 2025-11-22 10:04:17.761 186985 WARNING nova.compute.manager [req-cb7d6904-9366-48b4-aae6-d3075b55a32e req-47ffea33-2e1b-4c85-9f56-f1dd9db2ff85 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Received unexpected event network-vif-plugged-6465f074-89d4-4e64-b119-166c8af9a08e for instance with vm_state deleted and task_state None.
Nov 22 10:04:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:17.931 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:17.932 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:17.932 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:18 compute-0 podman[214313]: 2025-11-22 10:04:18.634012266 +0000 UTC m=+0.075216899 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 10:04:20 compute-0 nova_compute[186981]: 2025-11-22 10:04:20.070 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:20 compute-0 nova_compute[186981]: 2025-11-22 10:04:20.582 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:20 compute-0 nova_compute[186981]: 2025-11-22 10:04:20.981 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:21 compute-0 nova_compute[186981]: 2025-11-22 10:04:21.090 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:22 compute-0 nova_compute[186981]: 2025-11-22 10:04:22.387 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763805847.3846836, f541d5d7-2ed0-40b4-a4bb-46f142461bc2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:04:22 compute-0 nova_compute[186981]: 2025-11-22 10:04:22.387 186985 INFO nova.compute.manager [-] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] VM Stopped (Lifecycle Event)
Nov 22 10:04:22 compute-0 nova_compute[186981]: 2025-11-22 10:04:22.441 186985 DEBUG nova.compute.manager [None req-255b03b4-0861-4004-8f86-edc75082f10d - - - - - -] [instance: f541d5d7-2ed0-40b4-a4bb-46f142461bc2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:04:25 compute-0 nova_compute[186981]: 2025-11-22 10:04:25.075 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:25 compute-0 nova_compute[186981]: 2025-11-22 10:04:25.587 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:26 compute-0 podman[214338]: 2025-11-22 10:04:26.644372766 +0000 UTC m=+0.087180078 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:04:26 compute-0 podman[214339]: 2025-11-22 10:04:26.67648114 +0000 UTC m=+0.113241565 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 10:04:29 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:29.012 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:04:29 compute-0 nova_compute[186981]: 2025-11-22 10:04:29.012 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:29 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:29.014 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:04:30 compute-0 nova_compute[186981]: 2025-11-22 10:04:30.118 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:30 compute-0 nova_compute[186981]: 2025-11-22 10:04:30.560 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763805855.559284, 2c39aa0d-071b-45a1-9df3-aa0aadadf528 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:04:30 compute-0 nova_compute[186981]: 2025-11-22 10:04:30.560 186985 INFO nova.compute.manager [-] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] VM Stopped (Lifecycle Event)
Nov 22 10:04:30 compute-0 nova_compute[186981]: 2025-11-22 10:04:30.577 186985 DEBUG nova.compute.manager [None req-3c9caa6f-f514-4b3b-8cf3-eda7b13c020e - - - - - -] [instance: 2c39aa0d-071b-45a1-9df3-aa0aadadf528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:04:30 compute-0 nova_compute[186981]: 2025-11-22 10:04:30.590 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:31 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:31.017 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:04:32 compute-0 podman[214382]: 2025-11-22 10:04:32.633487187 +0000 UTC m=+0.080312428 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:04:32 compute-0 podman[214383]: 2025-11-22 10:04:32.653261711 +0000 UTC m=+0.091097165 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 10:04:35 compute-0 nova_compute[186981]: 2025-11-22 10:04:35.120 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:35 compute-0 nova_compute[186981]: 2025-11-22 10:04:35.592 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:36 compute-0 podman[214421]: 2025-11-22 10:04:36.602224081 +0000 UTC m=+0.063219490 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 10:04:36 compute-0 podman[214422]: 2025-11-22 10:04:36.634011484 +0000 UTC m=+0.082755155 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, container_name=multipathd, io.buildah.version=1.41.3)
Nov 22 10:04:40 compute-0 nova_compute[186981]: 2025-11-22 10:04:40.123 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:40 compute-0 nova_compute[186981]: 2025-11-22 10:04:40.593 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:40 compute-0 nova_compute[186981]: 2025-11-22 10:04:40.868 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:40 compute-0 nova_compute[186981]: 2025-11-22 10:04:40.868 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:40 compute-0 nova_compute[186981]: 2025-11-22 10:04:40.890 186985 DEBUG nova.compute.manager [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:04:40 compute-0 nova_compute[186981]: 2025-11-22 10:04:40.986 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:40 compute-0 nova_compute[186981]: 2025-11-22 10:04:40.987 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:40 compute-0 nova_compute[186981]: 2025-11-22 10:04:40.994 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:04:40 compute-0 nova_compute[186981]: 2025-11-22 10:04:40.995 186985 INFO nova.compute.claims [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.121 186985 DEBUG nova.compute.provider_tree [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.140 186985 DEBUG nova.scheduler.client.report [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.164 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.165 186985 DEBUG nova.compute.manager [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.221 186985 DEBUG nova.compute.manager [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.222 186985 DEBUG nova.network.neutron [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.245 186985 INFO nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.264 186985 DEBUG nova.compute.manager [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.356 186985 DEBUG nova.compute.manager [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.359 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.360 186985 INFO nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Creating image(s)
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.361 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.361 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.363 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.390 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.481 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.482 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.484 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.508 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.611 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.613 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.644 186985 DEBUG nova.policy [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.672 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.673 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.674 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.744 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.745 186985 DEBUG nova.virt.disk.api [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.746 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.815 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.817 186985 DEBUG nova.virt.disk.api [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.818 186985 DEBUG nova.objects.instance [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid a1a0b78c-a821-4fde-b4a4-171ae9e144a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.833 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.834 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Ensure instance console log exists: /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.834 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.835 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:41 compute-0 nova_compute[186981]: 2025-11-22 10:04:41.835 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:42 compute-0 nova_compute[186981]: 2025-11-22 10:04:42.291 186985 DEBUG nova.network.neutron [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Successfully created port: 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:04:43 compute-0 nova_compute[186981]: 2025-11-22 10:04:43.038 186985 DEBUG nova.network.neutron [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Successfully updated port: 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:04:43 compute-0 nova_compute[186981]: 2025-11-22 10:04:43.065 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:04:43 compute-0 nova_compute[186981]: 2025-11-22 10:04:43.066 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:04:43 compute-0 nova_compute[186981]: 2025-11-22 10:04:43.066 186985 DEBUG nova.network.neutron [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:04:43 compute-0 nova_compute[186981]: 2025-11-22 10:04:43.280 186985 DEBUG nova.compute.manager [req-840c093a-70ab-4362-8ffe-b8972b4e15f2 req-279f3f24-8b70-463d-a780-b387d0eae2e2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-changed-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:43 compute-0 nova_compute[186981]: 2025-11-22 10:04:43.281 186985 DEBUG nova.compute.manager [req-840c093a-70ab-4362-8ffe-b8972b4e15f2 req-279f3f24-8b70-463d-a780-b387d0eae2e2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Refreshing instance network info cache due to event network-changed-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:04:43 compute-0 nova_compute[186981]: 2025-11-22 10:04:43.281 186985 DEBUG oslo_concurrency.lockutils [req-840c093a-70ab-4362-8ffe-b8972b4e15f2 req-279f3f24-8b70-463d-a780-b387d0eae2e2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:04:43 compute-0 nova_compute[186981]: 2025-11-22 10:04:43.366 186985 DEBUG nova.network.neutron [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.101 186985 DEBUG nova.network.neutron [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updating instance_info_cache with network_info: [{"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.139 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.140 186985 DEBUG nova.compute.manager [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Instance network_info: |[{"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.141 186985 DEBUG oslo_concurrency.lockutils [req-840c093a-70ab-4362-8ffe-b8972b4e15f2 req-279f3f24-8b70-463d-a780-b387d0eae2e2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.141 186985 DEBUG nova.network.neutron [req-840c093a-70ab-4362-8ffe-b8972b4e15f2 req-279f3f24-8b70-463d-a780-b387d0eae2e2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Refreshing network info cache for port 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.146 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Start _get_guest_xml network_info=[{"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.154 186985 WARNING nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.159 186985 DEBUG nova.virt.libvirt.host [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.159 186985 DEBUG nova.virt.libvirt.host [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.165 186985 DEBUG nova.virt.libvirt.host [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.165 186985 DEBUG nova.virt.libvirt.host [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.165 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.166 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.166 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.166 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.167 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.167 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.167 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.167 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.167 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.168 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.168 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.168 186985 DEBUG nova.virt.hardware [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.172 186985 DEBUG nova.virt.libvirt.vif [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-730068357',display_name='tempest-TestNetworkBasicOps-server-730068357',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-730068357',id=3,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImQeMov6pORoa2jXsLOdfUIEvJ85N9QHuv8PP/kFK8MHgUxxXAFiFbT0/J6eT8MbccGFQWJlHCKkoeyt0nZaVr+hTNskWqKockEWoQ8p5e86JsemO0eYAbyhFt+MUVTcg==',key_name='tempest-TestNetworkBasicOps-1506447945',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-o7mt64q5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:04:41Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=a1a0b78c-a821-4fde-b4a4-171ae9e144a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.172 186985 DEBUG nova.network.os_vif_util [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.173 186985 DEBUG nova.network.os_vif_util [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:b8:16,bridge_name='br-int',has_traffic_filtering=True,id=77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5,network=Network(3a9237fd-d977-4a70-8d56-fb4443b7d2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77bcca4c-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.174 186985 DEBUG nova.objects.instance [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid a1a0b78c-a821-4fde-b4a4-171ae9e144a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.188 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:04:44 compute-0 nova_compute[186981]:   <uuid>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</uuid>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   <name>instance-00000003</name>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-730068357</nova:name>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:04:44</nova:creationTime>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:04:44 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:04:44 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:04:44 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:04:44 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:04:44 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:04:44 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:04:44 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:04:44 compute-0 nova_compute[186981]:         <nova:port uuid="77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5">
Nov 22 10:04:44 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <system>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <entry name="serial">a1a0b78c-a821-4fde-b4a4-171ae9e144a9</entry>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <entry name="uuid">a1a0b78c-a821-4fde-b4a4-171ae9e144a9</entry>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     </system>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   <os>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   </os>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   <features>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   </features>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk.config"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:9c:b8:16"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <target dev="tap77bcca4c-75"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/console.log" append="off"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <video>
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     </video>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:04:44 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:04:44 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:04:44 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:04:44 compute-0 nova_compute[186981]: </domain>
Nov 22 10:04:44 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.190 186985 DEBUG nova.compute.manager [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Preparing to wait for external event network-vif-plugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.190 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.190 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.190 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.191 186985 DEBUG nova.virt.libvirt.vif [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-730068357',display_name='tempest-TestNetworkBasicOps-server-730068357',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-730068357',id=3,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImQeMov6pORoa2jXsLOdfUIEvJ85N9QHuv8PP/kFK8MHgUxxXAFiFbT0/J6eT8MbccGFQWJlHCKkoeyt0nZaVr+hTNskWqKockEWoQ8p5e86JsemO0eYAbyhFt+MUVTcg==',key_name='tempest-TestNetworkBasicOps-1506447945',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-o7mt64q5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:04:41Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=a1a0b78c-a821-4fde-b4a4-171ae9e144a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.191 186985 DEBUG nova.network.os_vif_util [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.192 186985 DEBUG nova.network.os_vif_util [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:b8:16,bridge_name='br-int',has_traffic_filtering=True,id=77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5,network=Network(3a9237fd-d977-4a70-8d56-fb4443b7d2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77bcca4c-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.192 186985 DEBUG os_vif [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:b8:16,bridge_name='br-int',has_traffic_filtering=True,id=77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5,network=Network(3a9237fd-d977-4a70-8d56-fb4443b7d2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77bcca4c-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.193 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.193 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.194 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.198 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.198 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77bcca4c-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.198 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap77bcca4c-75, col_values=(('external_ids', {'iface-id': '77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9c:b8:16', 'vm-uuid': 'a1a0b78c-a821-4fde-b4a4-171ae9e144a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:04:44 compute-0 NetworkManager[55425]: <info>  [1763805884.2005] manager: (tap77bcca4c-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.199 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.202 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.208 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.210 186985 INFO os_vif [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:b8:16,bridge_name='br-int',has_traffic_filtering=True,id=77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5,network=Network(3a9237fd-d977-4a70-8d56-fb4443b7d2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77bcca4c-75')
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.270 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.270 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.270 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:9c:b8:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.271 186985 INFO nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Using config drive
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.564 186985 INFO nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Creating config drive at /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk.config
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.568 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5b8tu_34 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.692 186985 DEBUG oslo_concurrency.processutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5b8tu_34" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:04:44 compute-0 kernel: tap77bcca4c-75: entered promiscuous mode
Nov 22 10:04:44 compute-0 NetworkManager[55425]: <info>  [1763805884.7734] manager: (tap77bcca4c-75): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.824 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:44 compute-0 ovn_controller[95329]: 2025-11-22T10:04:44Z|00053|binding|INFO|Claiming lport 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 for this chassis.
Nov 22 10:04:44 compute-0 ovn_controller[95329]: 2025-11-22T10:04:44Z|00054|binding|INFO|77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5: Claiming fa:16:3e:9c:b8:16 10.100.0.7
Nov 22 10:04:44 compute-0 systemd-udevd[214496]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.833 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:44 compute-0 NetworkManager[55425]: <info>  [1763805884.8427] device (tap77bcca4c-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.842 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:b8:16 10.100.0.7'], port_security=['fa:16:3e:9c:b8:16 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a1a0b78c-a821-4fde-b4a4-171ae9e144a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a9237fd-d977-4a70-8d56-fb4443b7d2d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ebf61ea0-f815-490b-895f-16ea19134e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e5bbf08-60f5-43e7-9a4d-45e4f8bd58b8, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.844 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 in datapath 3a9237fd-d977-4a70-8d56-fb4443b7d2d4 bound to our chassis
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.844 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3a9237fd-d977-4a70-8d56-fb4443b7d2d4
Nov 22 10:04:44 compute-0 NetworkManager[55425]: <info>  [1763805884.8455] device (tap77bcca4c-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.854 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[81f7a416-09d0-42fc-8a94-d97d7a72a655]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.855 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3a9237fd-d1 in ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.857 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3a9237fd-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.857 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[8c9a1b38-d37d-4e33-a0b3-a164e7146854]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.858 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[73a89684-42c3-4643-ac0e-d27c66c788d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.871 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[84144460-f389-491c-bff3-f7f461930460]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:44 compute-0 systemd-machined[153303]: New machine qemu-3-instance-00000003.
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.911 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[30aa8fba-0907-468a-9ad3-4127b3dda26c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:44 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Nov 22 10:04:44 compute-0 ovn_controller[95329]: 2025-11-22T10:04:44Z|00055|binding|INFO|Setting lport 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 ovn-installed in OVS
Nov 22 10:04:44 compute-0 ovn_controller[95329]: 2025-11-22T10:04:44Z|00056|binding|INFO|Setting lport 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 up in Southbound
Nov 22 10:04:44 compute-0 nova_compute[186981]: 2025-11-22 10:04:44.919 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.939 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[bdf591e0-1fab-4aed-afe5-b56306080ef5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:44 compute-0 NetworkManager[55425]: <info>  [1763805884.9452] manager: (tap3a9237fd-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/34)
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.944 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[afe6051c-85dd-4b80-82f5-c485f47b455f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.973 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[95a49d3a-8447-4f66-a5eb-ab1dc3ebe03f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:44 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:44.975 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[3d4ff4c4-0797-4cc3-b279-7324ce3fbb8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:44 compute-0 NetworkManager[55425]: <info>  [1763805884.9962] device (tap3a9237fd-d0): carrier: link connected
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.002 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[21f60ddb-21ed-4eff-b721-5496a3853a0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.016 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[ec63172a-9561-4af8-8de7-fe46bd7a6f6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a9237fd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:b3:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334270, 'reachable_time': 31022, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214532, 'error': None, 'target': 'ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.035 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[db02b91c-7dc4-4f8f-aa65-eae2ddf8a8bf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:b3d7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 334270, 'tstamp': 334270}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214533, 'error': None, 'target': 'ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.055 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[4b8707b9-7713-4133-b8e9-a32d592a5bf4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a9237fd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:b3:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334270, 'reachable_time': 31022, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214534, 'error': None, 'target': 'ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.079 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[06af1da0-9890-423f-9f10-36d119da12c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.125 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.135 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa77d65-9dc7-4f9d-a13b-d49f3822276d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.136 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a9237fd-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.136 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.136 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a9237fd-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:04:45 compute-0 kernel: tap3a9237fd-d0: entered promiscuous mode
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.138 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:45 compute-0 NetworkManager[55425]: <info>  [1763805885.1390] manager: (tap3a9237fd-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.142 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.143 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3a9237fd-d0, col_values=(('external_ids', {'iface-id': 'ff45d21c-62c7-4384-8069-e5576a843cd4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.146 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a9237fd-d977-4a70-8d56-fb4443b7d2d4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a9237fd-d977-4a70-8d56-fb4443b7d2d4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:04:45 compute-0 ovn_controller[95329]: 2025-11-22T10:04:45Z|00057|binding|INFO|Releasing lport ff45d21c-62c7-4384-8069-e5576a843cd4 from this chassis (sb_readonly=0)
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.147 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[34714933-e766-4752-8ca6-8379ede2b9e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.147 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-3a9237fd-d977-4a70-8d56-fb4443b7d2d4
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/3a9237fd-d977-4a70-8d56-fb4443b7d2d4.pid.haproxy
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID 3a9237fd-d977-4a70-8d56-fb4443b7d2d4
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:04:45 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:04:45.149 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4', 'env', 'PROCESS_TAG=haproxy-3a9237fd-d977-4a70-8d56-fb4443b7d2d4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3a9237fd-d977-4a70-8d56-fb4443b7d2d4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.157 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.183 186985 DEBUG nova.network.neutron [req-840c093a-70ab-4362-8ffe-b8972b4e15f2 req-279f3f24-8b70-463d-a780-b387d0eae2e2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updated VIF entry in instance network info cache for port 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.184 186985 DEBUG nova.network.neutron [req-840c093a-70ab-4362-8ffe-b8972b4e15f2 req-279f3f24-8b70-463d-a780-b387d0eae2e2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updating instance_info_cache with network_info: [{"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.202 186985 DEBUG oslo_concurrency.lockutils [req-840c093a-70ab-4362-8ffe-b8972b4e15f2 req-279f3f24-8b70-463d-a780-b387d0eae2e2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.355 186985 DEBUG nova.compute.manager [req-90e580a9-2019-4c1b-84a4-6edbfd0153a3 req-e55c9476-ac0e-48a1-af07-e5ce456b61d8 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-vif-plugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.356 186985 DEBUG oslo_concurrency.lockutils [req-90e580a9-2019-4c1b-84a4-6edbfd0153a3 req-e55c9476-ac0e-48a1-af07-e5ce456b61d8 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.356 186985 DEBUG oslo_concurrency.lockutils [req-90e580a9-2019-4c1b-84a4-6edbfd0153a3 req-e55c9476-ac0e-48a1-af07-e5ce456b61d8 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.356 186985 DEBUG oslo_concurrency.lockutils [req-90e580a9-2019-4c1b-84a4-6edbfd0153a3 req-e55c9476-ac0e-48a1-af07-e5ce456b61d8 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.357 186985 DEBUG nova.compute.manager [req-90e580a9-2019-4c1b-84a4-6edbfd0153a3 req-e55c9476-ac0e-48a1-af07-e5ce456b61d8 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Processing event network-vif-plugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.357 186985 DEBUG nova.compute.manager [req-90e580a9-2019-4c1b-84a4-6edbfd0153a3 req-e55c9476-ac0e-48a1-af07-e5ce456b61d8 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-vif-plugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.357 186985 DEBUG oslo_concurrency.lockutils [req-90e580a9-2019-4c1b-84a4-6edbfd0153a3 req-e55c9476-ac0e-48a1-af07-e5ce456b61d8 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.357 186985 DEBUG oslo_concurrency.lockutils [req-90e580a9-2019-4c1b-84a4-6edbfd0153a3 req-e55c9476-ac0e-48a1-af07-e5ce456b61d8 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.358 186985 DEBUG oslo_concurrency.lockutils [req-90e580a9-2019-4c1b-84a4-6edbfd0153a3 req-e55c9476-ac0e-48a1-af07-e5ce456b61d8 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.358 186985 DEBUG nova.compute.manager [req-90e580a9-2019-4c1b-84a4-6edbfd0153a3 req-e55c9476-ac0e-48a1-af07-e5ce456b61d8 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] No waiting events found dispatching network-vif-plugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.358 186985 WARNING nova.compute.manager [req-90e580a9-2019-4c1b-84a4-6edbfd0153a3 req-e55c9476-ac0e-48a1-af07-e5ce456b61d8 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received unexpected event network-vif-plugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 for instance with vm_state building and task_state spawning.
Nov 22 10:04:45 compute-0 podman[214564]: 2025-11-22 10:04:45.510366134 +0000 UTC m=+0.060846085 container create e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 10:04:45 compute-0 systemd[1]: Started libpod-conmon-e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247.scope.
Nov 22 10:04:45 compute-0 podman[214564]: 2025-11-22 10:04:45.474774615 +0000 UTC m=+0.025254656 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:04:45 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:04:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cd5c3051c976381b4edd28ff9fa7f6aae4d6c6f3978819b2fb4954cedd513c1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:04:45 compute-0 podman[214564]: 2025-11-22 10:04:45.587105063 +0000 UTC m=+0.137585054 container init e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 22 10:04:45 compute-0 podman[214564]: 2025-11-22 10:04:45.595544105 +0000 UTC m=+0.146024076 container start e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:04:45 compute-0 neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4[214579]: [NOTICE]   (214583) : New worker (214585) forked
Nov 22 10:04:45 compute-0 neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4[214579]: [NOTICE]   (214583) : Loading success.
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.949 186985 DEBUG nova.compute.manager [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.950 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805885.949338, a1a0b78c-a821-4fde-b4a4-171ae9e144a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.950 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] VM Started (Lifecycle Event)
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.956 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.960 186985 INFO nova.virt.libvirt.driver [-] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Instance spawned successfully.
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.960 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.971 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.976 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.979 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.979 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.979 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.980 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.980 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:04:45 compute-0 nova_compute[186981]: 2025-11-22 10:04:45.981 186985 DEBUG nova.virt.libvirt.driver [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.004 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.004 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805885.9530118, a1a0b78c-a821-4fde-b4a4-171ae9e144a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.004 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] VM Paused (Lifecycle Event)
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.034 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.037 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805885.9549966, a1a0b78c-a821-4fde-b4a4-171ae9e144a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.038 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] VM Resumed (Lifecycle Event)
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.055 186985 INFO nova.compute.manager [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Took 4.70 seconds to spawn the instance on the hypervisor.
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.056 186985 DEBUG nova.compute.manager [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.065 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.068 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.088 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.113 186985 INFO nova.compute.manager [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Took 5.16 seconds to build instance.
Nov 22 10:04:46 compute-0 nova_compute[186981]: 2025-11-22 10:04:46.132 186985 DEBUG oslo_concurrency.lockutils [None req-4e4369df-65ff-4120-918f-52caeaf2e6c7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:04:49 compute-0 nova_compute[186981]: 2025-11-22 10:04:49.200 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:49 compute-0 podman[214601]: 2025-11-22 10:04:49.619427685 +0000 UTC m=+0.072703440 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 10:04:50 compute-0 nova_compute[186981]: 2025-11-22 10:04:50.127 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:51 compute-0 NetworkManager[55425]: <info>  [1763805891.6645] manager: (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 22 10:04:51 compute-0 NetworkManager[55425]: <info>  [1763805891.6659] manager: (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Nov 22 10:04:51 compute-0 nova_compute[186981]: 2025-11-22 10:04:51.663 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:51 compute-0 ovn_controller[95329]: 2025-11-22T10:04:51Z|00058|binding|INFO|Releasing lport ff45d21c-62c7-4384-8069-e5576a843cd4 from this chassis (sb_readonly=0)
Nov 22 10:04:51 compute-0 ovn_controller[95329]: 2025-11-22T10:04:51Z|00059|binding|INFO|Releasing lport ff45d21c-62c7-4384-8069-e5576a843cd4 from this chassis (sb_readonly=0)
Nov 22 10:04:51 compute-0 nova_compute[186981]: 2025-11-22 10:04:51.721 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:51 compute-0 nova_compute[186981]: 2025-11-22 10:04:51.730 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:52 compute-0 nova_compute[186981]: 2025-11-22 10:04:52.874 186985 DEBUG nova.compute.manager [req-3b648571-1afc-42d2-9dbc-7989f290a681 req-a606cc1b-1876-41f4-afda-8f282e8054a3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-changed-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:04:52 compute-0 nova_compute[186981]: 2025-11-22 10:04:52.876 186985 DEBUG nova.compute.manager [req-3b648571-1afc-42d2-9dbc-7989f290a681 req-a606cc1b-1876-41f4-afda-8f282e8054a3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Refreshing instance network info cache due to event network-changed-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:04:52 compute-0 nova_compute[186981]: 2025-11-22 10:04:52.877 186985 DEBUG oslo_concurrency.lockutils [req-3b648571-1afc-42d2-9dbc-7989f290a681 req-a606cc1b-1876-41f4-afda-8f282e8054a3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:04:52 compute-0 nova_compute[186981]: 2025-11-22 10:04:52.877 186985 DEBUG oslo_concurrency.lockutils [req-3b648571-1afc-42d2-9dbc-7989f290a681 req-a606cc1b-1876-41f4-afda-8f282e8054a3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:04:52 compute-0 nova_compute[186981]: 2025-11-22 10:04:52.878 186985 DEBUG nova.network.neutron [req-3b648571-1afc-42d2-9dbc-7989f290a681 req-a606cc1b-1876-41f4-afda-8f282e8054a3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Refreshing network info cache for port 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:04:54 compute-0 nova_compute[186981]: 2025-11-22 10:04:54.202 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:55 compute-0 nova_compute[186981]: 2025-11-22 10:04:55.130 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:04:55 compute-0 nova_compute[186981]: 2025-11-22 10:04:55.582 186985 DEBUG nova.network.neutron [req-3b648571-1afc-42d2-9dbc-7989f290a681 req-a606cc1b-1876-41f4-afda-8f282e8054a3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updated VIF entry in instance network info cache for port 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:04:55 compute-0 nova_compute[186981]: 2025-11-22 10:04:55.583 186985 DEBUG nova.network.neutron [req-3b648571-1afc-42d2-9dbc-7989f290a681 req-a606cc1b-1876-41f4-afda-8f282e8054a3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updating instance_info_cache with network_info: [{"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:04:55 compute-0 nova_compute[186981]: 2025-11-22 10:04:55.814 186985 DEBUG oslo_concurrency.lockutils [req-3b648571-1afc-42d2-9dbc-7989f290a681 req-a606cc1b-1876-41f4-afda-8f282e8054a3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:04:57 compute-0 podman[214637]: 2025-11-22 10:04:57.638172966 +0000 UTC m=+0.091867607 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 10:04:57 compute-0 podman[214638]: 2025-11-22 10:04:57.66668759 +0000 UTC m=+0.108323449 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 22 10:04:57 compute-0 ovn_controller[95329]: 2025-11-22T10:04:57Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9c:b8:16 10.100.0.7
Nov 22 10:04:57 compute-0 ovn_controller[95329]: 2025-11-22T10:04:57Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9c:b8:16 10.100.0.7
Nov 22 10:04:59 compute-0 nova_compute[186981]: 2025-11-22 10:04:59.204 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:00 compute-0 nova_compute[186981]: 2025-11-22 10:05:00.132 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:03 compute-0 podman[214681]: 2025-11-22 10:05:03.598440843 +0000 UTC m=+0.057458531 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:05:03 compute-0 podman[214682]: 2025-11-22 10:05:03.646538975 +0000 UTC m=+0.099100705 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 10:05:04 compute-0 nova_compute[186981]: 2025-11-22 10:05:04.017 186985 INFO nova.compute.manager [None req-43231681-9665-481e-874c-5cf8fbc79a96 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Get console output
Nov 22 10:05:04 compute-0 nova_compute[186981]: 2025-11-22 10:05:04.025 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:05:04 compute-0 nova_compute[186981]: 2025-11-22 10:05:04.242 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:05 compute-0 nova_compute[186981]: 2025-11-22 10:05:05.136 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:07 compute-0 nova_compute[186981]: 2025-11-22 10:05:07.493 186985 DEBUG oslo_concurrency.lockutils [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "interface-a1a0b78c-a821-4fde-b4a4-171ae9e144a9-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:07 compute-0 nova_compute[186981]: 2025-11-22 10:05:07.494 186985 DEBUG oslo_concurrency.lockutils [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "interface-a1a0b78c-a821-4fde-b4a4-171ae9e144a9-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:07 compute-0 nova_compute[186981]: 2025-11-22 10:05:07.494 186985 DEBUG nova.objects.instance [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'flavor' on Instance uuid a1a0b78c-a821-4fde-b4a4-171ae9e144a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:05:07 compute-0 podman[214720]: 2025-11-22 10:05:07.636636266 +0000 UTC m=+0.072452953 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 10:05:07 compute-0 podman[214721]: 2025-11-22 10:05:07.636727648 +0000 UTC m=+0.080455172 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd)
Nov 22 10:05:08 compute-0 nova_compute[186981]: 2025-11-22 10:05:08.681 186985 DEBUG nova.objects.instance [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_requests' on Instance uuid a1a0b78c-a821-4fde-b4a4-171ae9e144a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:05:08 compute-0 nova_compute[186981]: 2025-11-22 10:05:08.704 186985 DEBUG nova.network.neutron [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:05:09 compute-0 nova_compute[186981]: 2025-11-22 10:05:09.244 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:09 compute-0 nova_compute[186981]: 2025-11-22 10:05:09.336 186985 DEBUG nova.policy [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:05:09 compute-0 nova_compute[186981]: 2025-11-22 10:05:09.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:05:09 compute-0 nova_compute[186981]: 2025-11-22 10:05:09.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:05:09 compute-0 nova_compute[186981]: 2025-11-22 10:05:09.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:05:09 compute-0 nova_compute[186981]: 2025-11-22 10:05:09.784 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:05:09 compute-0 nova_compute[186981]: 2025-11-22 10:05:09.785 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquired lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:05:09 compute-0 nova_compute[186981]: 2025-11-22 10:05:09.785 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 10:05:09 compute-0 nova_compute[186981]: 2025-11-22 10:05:09.785 186985 DEBUG nova.objects.instance [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a1a0b78c-a821-4fde-b4a4-171ae9e144a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:05:10 compute-0 nova_compute[186981]: 2025-11-22 10:05:10.138 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:10 compute-0 nova_compute[186981]: 2025-11-22 10:05:10.635 186985 DEBUG nova.network.neutron [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Successfully created port: d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:05:11 compute-0 nova_compute[186981]: 2025-11-22 10:05:11.310 186985 DEBUG nova.network.neutron [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Successfully updated port: d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:05:11 compute-0 nova_compute[186981]: 2025-11-22 10:05:11.333 186985 DEBUG oslo_concurrency.lockutils [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:05:11 compute-0 nova_compute[186981]: 2025-11-22 10:05:11.401 186985 DEBUG nova.compute.manager [req-10f6e1c6-9351-4fa6-9e33-f2b08b6ae451 req-783ee875-347f-4b5a-9ddb-983b3618643f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-changed-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:11 compute-0 nova_compute[186981]: 2025-11-22 10:05:11.402 186985 DEBUG nova.compute.manager [req-10f6e1c6-9351-4fa6-9e33-f2b08b6ae451 req-783ee875-347f-4b5a-9ddb-983b3618643f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Refreshing instance network info cache due to event network-changed-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:05:11 compute-0 nova_compute[186981]: 2025-11-22 10:05:11.402 186985 DEBUG oslo_concurrency.lockutils [req-10f6e1c6-9351-4fa6-9e33-f2b08b6ae451 req-783ee875-347f-4b5a-9ddb-983b3618643f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:05:12 compute-0 nova_compute[186981]: 2025-11-22 10:05:12.588 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updating instance_info_cache with network_info: [{"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": null, "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:05:12 compute-0 nova_compute[186981]: 2025-11-22 10:05:12.606 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Releasing lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:05:12 compute-0 nova_compute[186981]: 2025-11-22 10:05:12.606 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 10:05:12 compute-0 nova_compute[186981]: 2025-11-22 10:05:12.607 186985 DEBUG oslo_concurrency.lockutils [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:05:12 compute-0 nova_compute[186981]: 2025-11-22 10:05:12.607 186985 DEBUG nova.network.neutron [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:05:12 compute-0 nova_compute[186981]: 2025-11-22 10:05:12.608 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:05:12 compute-0 nova_compute[186981]: 2025-11-22 10:05:12.608 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:05:12 compute-0 nova_compute[186981]: 2025-11-22 10:05:12.608 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:05:12 compute-0 nova_compute[186981]: 2025-11-22 10:05:12.762 186985 WARNING nova.network.neutron [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] 4b620120-faec-4acd-a5db-2dd4d52d4de4 already exists in list: networks containing: ['4b620120-faec-4acd-a5db-2dd4d52d4de4']. ignoring it
Nov 22 10:05:12 compute-0 nova_compute[186981]: 2025-11-22 10:05:12.763 186985 WARNING nova.network.neutron [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 already exists in list: port_ids containing: ['d63b0b5c-fc47-476c-b88b-d0aaf0af46c1']. ignoring it
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.246 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.640 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.640 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.640 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.641 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.847 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.903 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.904 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:05:14 compute-0 nova_compute[186981]: 2025-11-22 10:05:14.954 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.042 186985 DEBUG nova.network.neutron [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updating instance_info_cache with network_info: [{"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.088 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.089 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5617MB free_disk=73.43391036987305GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.089 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.090 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.140 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.166 186985 DEBUG oslo_concurrency.lockutils [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.167 186985 DEBUG oslo_concurrency.lockutils [req-10f6e1c6-9351-4fa6-9e33-f2b08b6ae451 req-783ee875-347f-4b5a-9ddb-983b3618643f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.167 186985 DEBUG nova.network.neutron [req-10f6e1c6-9351-4fa6-9e33-f2b08b6ae451 req-783ee875-347f-4b5a-9ddb-983b3618643f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Refreshing network info cache for port d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.170 186985 DEBUG nova.virt.libvirt.vif [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-730068357',display_name='tempest-TestNetworkBasicOps-server-730068357',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-730068357',id=3,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImQeMov6pORoa2jXsLOdfUIEvJ85N9QHuv8PP/kFK8MHgUxxXAFiFbT0/J6eT8MbccGFQWJlHCKkoeyt0nZaVr+hTNskWqKockEWoQ8p5e86JsemO0eYAbyhFt+MUVTcg==',key_name='tempest-TestNetworkBasicOps-1506447945',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:04:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-o7mt64q5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:04:46Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=a1a0b78c-a821-4fde-b4a4-171ae9e144a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.170 186985 DEBUG nova.network.os_vif_util [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.171 186985 DEBUG nova.network.os_vif_util [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.171 186985 DEBUG os_vif [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.172 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.172 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.172 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.175 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.175 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd63b0b5c-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.176 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd63b0b5c-fc, col_values=(('external_ids', {'iface-id': 'd63b0b5c-fc47-476c-b88b-d0aaf0af46c1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:7c:27', 'vm-uuid': 'a1a0b78c-a821-4fde-b4a4-171ae9e144a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.177 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 NetworkManager[55425]: <info>  [1763805915.1789] manager: (tapd63b0b5c-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.180 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.185 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.186 186985 INFO os_vif [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc')
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.187 186985 DEBUG nova.virt.libvirt.vif [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-730068357',display_name='tempest-TestNetworkBasicOps-server-730068357',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-730068357',id=3,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImQeMov6pORoa2jXsLOdfUIEvJ85N9QHuv8PP/kFK8MHgUxxXAFiFbT0/J6eT8MbccGFQWJlHCKkoeyt0nZaVr+hTNskWqKockEWoQ8p5e86JsemO0eYAbyhFt+MUVTcg==',key_name='tempest-TestNetworkBasicOps-1506447945',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:04:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-o7mt64q5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:04:46Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=a1a0b78c-a821-4fde-b4a4-171ae9e144a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.187 186985 DEBUG nova.network.os_vif_util [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.188 186985 DEBUG nova.network.os_vif_util [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.190 186985 DEBUG nova.virt.libvirt.guest [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] attach device xml: <interface type="ethernet">
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <mac address="fa:16:3e:f6:7c:27"/>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <model type="virtio"/>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <mtu size="1442"/>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <target dev="tapd63b0b5c-fc"/>
Nov 22 10:05:15 compute-0 nova_compute[186981]: </interface>
Nov 22 10:05:15 compute-0 nova_compute[186981]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Nov 22 10:05:15 compute-0 kernel: tapd63b0b5c-fc: entered promiscuous mode
Nov 22 10:05:15 compute-0 NetworkManager[55425]: <info>  [1763805915.2041] manager: (tapd63b0b5c-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Nov 22 10:05:15 compute-0 ovn_controller[95329]: 2025-11-22T10:05:15Z|00060|binding|INFO|Claiming lport d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 for this chassis.
Nov 22 10:05:15 compute-0 ovn_controller[95329]: 2025-11-22T10:05:15Z|00061|binding|INFO|d63b0b5c-fc47-476c-b88b-d0aaf0af46c1: Claiming fa:16:3e:f6:7c:27 10.100.0.20
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.205 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.208 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.221 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:7c:27 10.100.0.20'], port_security=['fa:16:3e:f6:7c:27 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'a1a0b78c-a821-4fde-b4a4-171ae9e144a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b620120-faec-4acd-a5db-2dd4d52d4de4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27c5a67c-dc4c-4d67-b4f1-e6a36c0e1eec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ba99cac3-7b23-4d64-a684-4a4fe0879ad7, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.222 104216 INFO neutron.agent.ovn.metadata.agent [-] Port d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 in datapath 4b620120-faec-4acd-a5db-2dd4d52d4de4 bound to our chassis
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.223 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b620120-faec-4acd-a5db-2dd4d52d4de4
Nov 22 10:05:15 compute-0 systemd-udevd[214781]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.233 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[8a203c4a-29d8-4707-a99e-d63d3c33ce31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.234 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b620120-f1 in ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.236 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b620120-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.236 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[573d680a-b411-4ad1-9b53-ff897825c0aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.237 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7194d2d6-c896-4d52-b9d8-66043172a419]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.243 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 ovn_controller[95329]: 2025-11-22T10:05:15Z|00062|binding|INFO|Setting lport d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 ovn-installed in OVS
Nov 22 10:05:15 compute-0 ovn_controller[95329]: 2025-11-22T10:05:15Z|00063|binding|INFO|Setting lport d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 up in Southbound
Nov 22 10:05:15 compute-0 NetworkManager[55425]: <info>  [1763805915.2460] device (tapd63b0b5c-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:05:15 compute-0 NetworkManager[55425]: <info>  [1763805915.2466] device (tapd63b0b5c-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.254 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Instance a1a0b78c-a821-4fde-b4a4-171ae9e144a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.254 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[1abcce9d-0130-47d9-bd81-e63ef0ac34c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.255 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.256 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.280 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[165e6eec-8500-4f73-bdd3-e4aed9bbcbb8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.302 186985 DEBUG nova.virt.libvirt.driver [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.303 186985 DEBUG nova.virt.libvirt.driver [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.304 186985 DEBUG nova.virt.libvirt.driver [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:9c:b8:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.304 186985 DEBUG nova.virt.libvirt.driver [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:f6:7c:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.310 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[21b7777e-52a1-4661-a0a0-213901aae4e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.315 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[1f233368-2f65-4c81-9d22-fd1fd3a12758]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 NetworkManager[55425]: <info>  [1763805915.3162] manager: (tap4b620120-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.319 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.334 186985 DEBUG nova.virt.libvirt.guest [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-730068357</nova:name>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:05:15</nova:creationTime>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:05:15 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:05:15 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:05:15 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:05:15 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:05:15 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:05:15 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:05:15 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:05:15 compute-0 nova_compute[186981]:     <nova:port uuid="77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5">
Nov 22 10:05:15 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:05:15 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:05:15 compute-0 nova_compute[186981]:     <nova:port uuid="d63b0b5c-fc47-476c-b88b-d0aaf0af46c1">
Nov 22 10:05:15 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Nov 22 10:05:15 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:05:15 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:05:15 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:05:15 compute-0 nova_compute[186981]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.336 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.345 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[bf00fb4d-38ab-4e57-950b-5b6519a8c215]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.348 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[05ff8e63-82f8-4026-ba8c-7e3592499b07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 NetworkManager[55425]: <info>  [1763805915.3712] device (tap4b620120-f0): carrier: link connected
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.371 186985 DEBUG oslo_concurrency.lockutils [None req-b1cfbfbf-1698-44fd-ab0e-61206aaec650 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "interface-a1a0b78c-a821-4fde-b4a4-171ae9e144a9-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.877s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.374 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.375 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.374 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[236da78c-34b9-4771-8da0-34bb1656ad21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.390 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[fa17c3d3-8884-46f7-adc4-b7d814051a46]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b620120-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:de:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337307, 'reachable_time': 31610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214807, 'error': None, 'target': 'ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.413 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[255ae84d-1924-4627-a1fc-d9fe89b6544d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8c:de2f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 337307, 'tstamp': 337307}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214808, 'error': None, 'target': 'ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.428 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[b7263ba8-144c-48e3-a7f9-e1ef64a9c9c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b620120-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:de:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337307, 'reachable_time': 31610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214809, 'error': None, 'target': 'ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.459 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5c0676-66ca-43ff-9eee-0231f6a500b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.512 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[b842b63f-d2d3-4008-be43-84e08f759558]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.513 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b620120-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.513 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.514 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b620120-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:15 compute-0 NetworkManager[55425]: <info>  [1763805915.5167] manager: (tap4b620120-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Nov 22 10:05:15 compute-0 kernel: tap4b620120-f0: entered promiscuous mode
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.516 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.518 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.519 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b620120-f0, col_values=(('external_ids', {'iface-id': 'b2eabd1d-2a26-4f04-add8-4601bef781f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:15 compute-0 ovn_controller[95329]: 2025-11-22T10:05:15Z|00064|binding|INFO|Releasing lport b2eabd1d-2a26-4f04-add8-4601bef781f0 from this chassis (sb_readonly=0)
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.520 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.537 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.537 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b620120-faec-4acd-a5db-2dd4d52d4de4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b620120-faec-4acd-a5db-2dd4d52d4de4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.538 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[1360c3f6-3018-4a17-9d51-01ff2a1645cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.539 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-4b620120-faec-4acd-a5db-2dd4d52d4de4
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/4b620120-faec-4acd-a5db-2dd4d52d4de4.pid.haproxy
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID 4b620120-faec-4acd-a5db-2dd4d52d4de4
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:05:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:15.539 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4', 'env', 'PROCESS_TAG=haproxy-4b620120-faec-4acd-a5db-2dd4d52d4de4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b620120-faec-4acd-a5db-2dd4d52d4de4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.829 186985 DEBUG nova.compute.manager [req-66683f19-614e-45da-963e-e4de0ea6850f req-5fa0b761-c02b-407d-912a-32cbf6eb658a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-vif-plugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.830 186985 DEBUG oslo_concurrency.lockutils [req-66683f19-614e-45da-963e-e4de0ea6850f req-5fa0b761-c02b-407d-912a-32cbf6eb658a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.831 186985 DEBUG oslo_concurrency.lockutils [req-66683f19-614e-45da-963e-e4de0ea6850f req-5fa0b761-c02b-407d-912a-32cbf6eb658a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.832 186985 DEBUG oslo_concurrency.lockutils [req-66683f19-614e-45da-963e-e4de0ea6850f req-5fa0b761-c02b-407d-912a-32cbf6eb658a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.832 186985 DEBUG nova.compute.manager [req-66683f19-614e-45da-963e-e4de0ea6850f req-5fa0b761-c02b-407d-912a-32cbf6eb658a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] No waiting events found dispatching network-vif-plugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:05:15 compute-0 nova_compute[186981]: 2025-11-22 10:05:15.833 186985 WARNING nova.compute.manager [req-66683f19-614e-45da-963e-e4de0ea6850f req-5fa0b761-c02b-407d-912a-32cbf6eb658a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received unexpected event network-vif-plugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 for instance with vm_state active and task_state None.
Nov 22 10:05:15 compute-0 podman[214838]: 2025-11-22 10:05:15.920184996 +0000 UTC m=+0.063096355 container create 43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 10:05:15 compute-0 systemd[1]: Started libpod-conmon-43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb.scope.
Nov 22 10:05:15 compute-0 podman[214838]: 2025-11-22 10:05:15.878556611 +0000 UTC m=+0.021467990 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:05:15 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:05:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00b2c7ddabfd0518e47cf015ad9bd72c357c3f3eb02af31d804ec2281a5b3717/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:05:16 compute-0 podman[214838]: 2025-11-22 10:05:16.0127194 +0000 UTC m=+0.155630779 container init 43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 22 10:05:16 compute-0 podman[214838]: 2025-11-22 10:05:16.017686477 +0000 UTC m=+0.160597836 container start 43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 10:05:16 compute-0 neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4[214854]: [NOTICE]   (214858) : New worker (214860) forked
Nov 22 10:05:16 compute-0 neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4[214854]: [NOTICE]   (214858) : Loading success.
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.371 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.561 186985 DEBUG oslo_concurrency.lockutils [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "interface-a1a0b78c-a821-4fde-b4a4-171ae9e144a9-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.562 186985 DEBUG oslo_concurrency.lockutils [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "interface-a1a0b78c-a821-4fde-b4a4-171ae9e144a9-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.579 186985 DEBUG nova.objects.instance [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'flavor' on Instance uuid a1a0b78c-a821-4fde-b4a4-171ae9e144a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.590 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.606 186985 DEBUG nova.virt.libvirt.vif [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-730068357',display_name='tempest-TestNetworkBasicOps-server-730068357',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-730068357',id=3,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImQeMov6pORoa2jXsLOdfUIEvJ85N9QHuv8PP/kFK8MHgUxxXAFiFbT0/J6eT8MbccGFQWJlHCKkoeyt0nZaVr+hTNskWqKockEWoQ8p5e86JsemO0eYAbyhFt+MUVTcg==',key_name='tempest-TestNetworkBasicOps-1506447945',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:04:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-o7mt64q5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:04:46Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=a1a0b78c-a821-4fde-b4a4-171ae9e144a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.606 186985 DEBUG nova.network.os_vif_util [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.607 186985 DEBUG nova.network.os_vif_util [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.612 186985 DEBUG nova.virt.libvirt.guest [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f6:7c:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd63b0b5c-fc"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.615 186985 DEBUG nova.virt.libvirt.guest [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f6:7c:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd63b0b5c-fc"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.618 186985 DEBUG nova.virt.libvirt.driver [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Attempting to detach device tapd63b0b5c-fc from instance a1a0b78c-a821-4fde-b4a4-171ae9e144a9 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.619 186985 DEBUG nova.virt.libvirt.guest [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] detach device xml: <interface type="ethernet">
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <mac address="fa:16:3e:f6:7c:27"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <model type="virtio"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <mtu size="1442"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <target dev="tapd63b0b5c-fc"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]: </interface>
Nov 22 10:05:16 compute-0 nova_compute[186981]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.625 186985 DEBUG nova.virt.libvirt.guest [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f6:7c:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd63b0b5c-fc"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.629 186985 DEBUG nova.virt.libvirt.guest [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f6:7c:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd63b0b5c-fc"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <name>instance-00000003</name>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <uuid>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</uuid>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-730068357</nova:name>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:05:15</nova:creationTime>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:port uuid="77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5">
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:port uuid="d63b0b5c-fc47-476c-b88b-d0aaf0af46c1">
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:05:16 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <memory unit='KiB'>131072</memory>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <vcpu placement='static'>1</vcpu>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <resource>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <partition>/machine</partition>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </resource>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <sysinfo type='smbios'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <system>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='manufacturer'>RDO</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='serial'>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='uuid'>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='family'>Virtual Machine</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </system>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <os>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <boot dev='hd'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <smbios mode='sysinfo'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </os>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <features>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <vmcoreinfo state='on'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </features>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <vendor>AMD</vendor>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='x2apic'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc-deadline'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='hypervisor'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc_adjust'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='spec-ctrl'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='stibp'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='ssbd'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='cmp_legacy'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='overflow-recov'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='succor'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='ibrs'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='amd-ssbd'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='virt-ssbd'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='lbrv'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='tsc-scale'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='vmcb-clean'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='flushbyasid'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='pause-filter'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='pfthreshold'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='xsaves'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='svm'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='topoext'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='npt'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='nrip-save'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <clock offset='utc'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <timer name='hpet' present='no'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <on_poweroff>destroy</on_poweroff>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <on_reboot>restart</on_reboot>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <on_crash>destroy</on_crash>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <disk type='file' device='disk'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk' index='2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <backingStore type='file' index='3'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:         <format type='raw'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:         <source file='/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:         <backingStore/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       </backingStore>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target dev='vda' bus='virtio'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='virtio-disk0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <disk type='file' device='cdrom'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk.config' index='1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <backingStore/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target dev='sda' bus='sata'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <readonly/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='sata0-0-0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pcie.0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='1' port='0x10'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='2' port='0x11'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='3' port='0x12'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.3'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='4' port='0x13'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.4'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='5' port='0x14'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.5'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='6' port='0x15'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.6'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='7' port='0x16'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.7'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='8' port='0x17'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.8'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='9' port='0x18'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.9'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='10' port='0x19'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.10'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='11' port='0x1a'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.11'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='12' port='0x1b'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.12'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='13' port='0x1c'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.13'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='14' port='0x1d'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.14'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='15' port='0x1e'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.15'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='16' port='0x1f'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.16'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='17' port='0x20'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.17'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='18' port='0x21'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.18'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='19' port='0x22'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.19'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='20' port='0x23'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.20'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='21' port='0x24'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.21'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='22' port='0x25'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.22'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='23' port='0x26'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.23'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='24' port='0x27'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.24'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='25' port='0x28'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.25'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-pci-bridge'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.26'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='usb'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='sata' index='0'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='ide'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <interface type='ethernet'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <mac address='fa:16:3e:9c:b8:16'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target dev='tap77bcca4c-75'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model type='virtio'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <mtu size='1442'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='net0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <interface type='ethernet'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <mac address='fa:16:3e:f6:7c:27'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target dev='tapd63b0b5c-fc'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model type='virtio'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <mtu size='1442'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='net1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <serial type='pty'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/console.log' append='off'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target type='isa-serial' port='0'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:         <model name='isa-serial'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       </target>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/console.log' append='off'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target type='serial' port='0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </console>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <input type='tablet' bus='usb'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='input0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='usb' bus='0' port='1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <input type='mouse' bus='ps2'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='input1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <input type='keyboard' bus='ps2'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='input2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <listen type='address' address='::0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </graphics>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <audio id='1' type='none'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <video>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='video0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </video>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <watchdog model='itco' action='reset'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='watchdog0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </watchdog>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <memballoon model='virtio'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <stats period='10'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='balloon0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <rng model='virtio'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <backend model='random'>/dev/urandom</backend>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='rng0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <label>system_u:system_r:svirt_t:s0:c291,c756</label>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c291,c756</imagelabel>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <label>+107:+107</label>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <imagelabel>+107:+107</imagelabel>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:05:16 compute-0 nova_compute[186981]: </domain>
Nov 22 10:05:16 compute-0 nova_compute[186981]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.631 186985 INFO nova.virt.libvirt.driver [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully detached device tapd63b0b5c-fc from instance a1a0b78c-a821-4fde-b4a4-171ae9e144a9 from the persistent domain config.
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.632 186985 DEBUG nova.virt.libvirt.driver [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] (1/8): Attempting to detach device tapd63b0b5c-fc with device alias net1 from instance a1a0b78c-a821-4fde-b4a4-171ae9e144a9 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.633 186985 DEBUG nova.virt.libvirt.guest [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] detach device xml: <interface type="ethernet">
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <mac address="fa:16:3e:f6:7c:27"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <model type="virtio"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <mtu size="1442"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <target dev="tapd63b0b5c-fc"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]: </interface>
Nov 22 10:05:16 compute-0 nova_compute[186981]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 22 10:05:16 compute-0 kernel: tapd63b0b5c-fc (unregistering): left promiscuous mode
Nov 22 10:05:16 compute-0 NetworkManager[55425]: <info>  [1763805916.7046] device (tapd63b0b5c-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:05:16 compute-0 ovn_controller[95329]: 2025-11-22T10:05:16Z|00065|binding|INFO|Releasing lport d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 from this chassis (sb_readonly=0)
Nov 22 10:05:16 compute-0 ovn_controller[95329]: 2025-11-22T10:05:16Z|00066|binding|INFO|Setting lport d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 down in Southbound
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.705 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:16 compute-0 ovn_controller[95329]: 2025-11-22T10:05:16Z|00067|binding|INFO|Removing iface tapd63b0b5c-fc ovn-installed in OVS
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.708 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.713 186985 DEBUG nova.virt.libvirt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Received event <DeviceRemovedEvent: 1763805916.7128198, a1a0b78c-a821-4fde-b4a4-171ae9e144a9 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 22 10:05:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:16.714 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:7c:27 10.100.0.20'], port_security=['fa:16:3e:f6:7c:27 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'a1a0b78c-a821-4fde-b4a4-171ae9e144a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b620120-faec-4acd-a5db-2dd4d52d4de4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '27c5a67c-dc4c-4d67-b4f1-e6a36c0e1eec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ba99cac3-7b23-4d64-a684-4a4fe0879ad7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:05:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:16.715 104216 INFO neutron.agent.ovn.metadata.agent [-] Port d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 in datapath 4b620120-faec-4acd-a5db-2dd4d52d4de4 unbound from our chassis
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.715 186985 DEBUG nova.virt.libvirt.driver [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Start waiting for the detach event from libvirt for device tapd63b0b5c-fc with device alias net1 for instance a1a0b78c-a821-4fde-b4a4-171ae9e144a9 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.716 186985 DEBUG nova.virt.libvirt.guest [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f6:7c:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd63b0b5c-fc"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:05:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:16.716 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b620120-faec-4acd-a5db-2dd4d52d4de4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:05:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:16.717 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[78f9b2e3-bfe3-4d00-9ed8-602d91d74bf1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:16.718 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4 namespace which is not needed anymore
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.720 186985 DEBUG nova.virt.libvirt.guest [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f6:7c:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd63b0b5c-fc"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <name>instance-00000003</name>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <uuid>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</uuid>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-730068357</nova:name>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:05:15</nova:creationTime>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:port uuid="77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5">
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:port uuid="d63b0b5c-fc47-476c-b88b-d0aaf0af46c1">
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:05:16 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <memory unit='KiB'>131072</memory>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <vcpu placement='static'>1</vcpu>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <resource>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <partition>/machine</partition>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </resource>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <sysinfo type='smbios'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <system>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='manufacturer'>RDO</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='serial'>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='uuid'>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <entry name='family'>Virtual Machine</entry>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </system>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <os>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <boot dev='hd'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <smbios mode='sysinfo'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </os>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <features>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <vmcoreinfo state='on'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </features>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <vendor>AMD</vendor>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='x2apic'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc-deadline'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='hypervisor'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc_adjust'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='spec-ctrl'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='stibp'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='ssbd'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='cmp_legacy'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='overflow-recov'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='succor'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='ibrs'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='amd-ssbd'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='virt-ssbd'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='lbrv'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='tsc-scale'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='vmcb-clean'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='flushbyasid'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='pause-filter'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='pfthreshold'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='xsaves'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='svm'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='require' name='topoext'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='npt'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <feature policy='disable' name='nrip-save'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <clock offset='utc'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <timer name='hpet' present='no'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <on_poweroff>destroy</on_poweroff>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <on_reboot>restart</on_reboot>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <on_crash>destroy</on_crash>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <disk type='file' device='disk'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk' index='2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <backingStore type='file' index='3'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:         <format type='raw'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:         <source file='/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:         <backingStore/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       </backingStore>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target dev='vda' bus='virtio'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='virtio-disk0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <disk type='file' device='cdrom'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk.config' index='1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <backingStore/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target dev='sda' bus='sata'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <readonly/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='sata0-0-0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pcie.0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='1' port='0x10'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='2' port='0x11'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='3' port='0x12'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.3'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='4' port='0x13'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.4'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='5' port='0x14'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.5'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='6' port='0x15'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.6'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='7' port='0x16'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.7'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='8' port='0x17'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.8'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='9' port='0x18'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.9'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='10' port='0x19'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.10'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='11' port='0x1a'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.11'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='12' port='0x1b'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.12'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='13' port='0x1c'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.13'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='14' port='0x1d'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.14'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='15' port='0x1e'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.15'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='16' port='0x1f'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.16'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='17' port='0x20'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.17'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='18' port='0x21'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.18'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='19' port='0x22'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.19'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='20' port='0x23'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.20'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='21' port='0x24'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.21'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='22' port='0x25'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.22'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='23' port='0x26'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.23'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='24' port='0x27'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.24'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target chassis='25' port='0x28'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.25'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model name='pcie-pci-bridge'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='pci.26'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='usb'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <controller type='sata' index='0'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='ide'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <interface type='ethernet'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <mac address='fa:16:3e:9c:b8:16'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target dev='tap77bcca4c-75'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model type='virtio'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <mtu size='1442'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='net0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <serial type='pty'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/console.log' append='off'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target type='isa-serial' port='0'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:         <model name='isa-serial'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       </target>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/console.log' append='off'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <target type='serial' port='0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </console>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <input type='tablet' bus='usb'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='input0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='usb' bus='0' port='1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <input type='mouse' bus='ps2'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='input1'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <input type='keyboard' bus='ps2'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='input2'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <listen type='address' address='::0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </graphics>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <audio id='1' type='none'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <video>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='video0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </video>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <watchdog model='itco' action='reset'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='watchdog0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </watchdog>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <memballoon model='virtio'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <stats period='10'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='balloon0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <rng model='virtio'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <backend model='random'>/dev/urandom</backend>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <alias name='rng0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <label>system_u:system_r:svirt_t:s0:c291,c756</label>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c291,c756</imagelabel>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <label>+107:+107</label>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <imagelabel>+107:+107</imagelabel>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:05:16 compute-0 nova_compute[186981]: </domain>
Nov 22 10:05:16 compute-0 nova_compute[186981]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.720 186985 INFO nova.virt.libvirt.driver [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully detached device tapd63b0b5c-fc from instance a1a0b78c-a821-4fde-b4a4-171ae9e144a9 from the live domain config.
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.721 186985 DEBUG nova.virt.libvirt.vif [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-730068357',display_name='tempest-TestNetworkBasicOps-server-730068357',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-730068357',id=3,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImQeMov6pORoa2jXsLOdfUIEvJ85N9QHuv8PP/kFK8MHgUxxXAFiFbT0/J6eT8MbccGFQWJlHCKkoeyt0nZaVr+hTNskWqKockEWoQ8p5e86JsemO0eYAbyhFt+MUVTcg==',key_name='tempest-TestNetworkBasicOps-1506447945',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:04:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-o7mt64q5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:04:46Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=a1a0b78c-a821-4fde-b4a4-171ae9e144a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.721 186985 DEBUG nova.network.os_vif_util [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.722 186985 DEBUG nova.network.os_vif_util [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.723 186985 DEBUG os_vif [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.725 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.725 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd63b0b5c-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.727 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.728 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.732 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.734 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.736 186985 INFO os_vif [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc')
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.737 186985 DEBUG nova.virt.libvirt.guest [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-730068357</nova:name>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:05:16</nova:creationTime>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     <nova:port uuid="77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5">
Nov 22 10:05:16 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:05:16 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:05:16 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:05:16 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:05:16 compute-0 nova_compute[186981]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.809 186985 DEBUG nova.network.neutron [req-10f6e1c6-9351-4fa6-9e33-f2b08b6ae451 req-783ee875-347f-4b5a-9ddb-983b3618643f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updated VIF entry in instance network info cache for port d63b0b5c-fc47-476c-b88b-d0aaf0af46c1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.809 186985 DEBUG nova.network.neutron [req-10f6e1c6-9351-4fa6-9e33-f2b08b6ae451 req-783ee875-347f-4b5a-9ddb-983b3618643f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updating instance_info_cache with network_info: [{"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.833 186985 DEBUG oslo_concurrency.lockutils [req-10f6e1c6-9351-4fa6-9e33-f2b08b6ae451 req-783ee875-347f-4b5a-9ddb-983b3618643f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:05:16 compute-0 neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4[214854]: [NOTICE]   (214858) : haproxy version is 2.8.14-c23fe91
Nov 22 10:05:16 compute-0 neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4[214854]: [NOTICE]   (214858) : path to executable is /usr/sbin/haproxy
Nov 22 10:05:16 compute-0 neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4[214854]: [WARNING]  (214858) : Exiting Master process...
Nov 22 10:05:16 compute-0 neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4[214854]: [ALERT]    (214858) : Current worker (214860) exited with code 143 (Terminated)
Nov 22 10:05:16 compute-0 neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4[214854]: [WARNING]  (214858) : All workers exited. Exiting... (0)
Nov 22 10:05:16 compute-0 systemd[1]: libpod-43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb.scope: Deactivated successfully.
Nov 22 10:05:16 compute-0 podman[214892]: 2025-11-22 10:05:16.858277686 +0000 UTC m=+0.053253415 container died 43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:05:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb-userdata-shm.mount: Deactivated successfully.
Nov 22 10:05:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-00b2c7ddabfd0518e47cf015ad9bd72c357c3f3eb02af31d804ec2281a5b3717-merged.mount: Deactivated successfully.
Nov 22 10:05:16 compute-0 podman[214892]: 2025-11-22 10:05:16.900926588 +0000 UTC m=+0.095902317 container cleanup 43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 10:05:16 compute-0 systemd[1]: libpod-conmon-43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb.scope: Deactivated successfully.
Nov 22 10:05:16 compute-0 podman[214922]: 2025-11-22 10:05:16.970241824 +0000 UTC m=+0.044537175 container remove 43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 10:05:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:16.975 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[4f086cec-2084-403f-ad7b-81a9c5ec3d5c]: (4, ('Sat Nov 22 10:05:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4 (43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb)\n43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb\nSat Nov 22 10:05:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4 (43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb)\n43e084341ff585c0a72650ad76da076de229dd32b16eba272eb6f8fb6b24b7eb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:16.976 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[d07ce529-69bc-4dde-a50c-8e782572b211]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:16.977 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b620120-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:16 compute-0 nova_compute[186981]: 2025-11-22 10:05:16.978 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:16 compute-0 kernel: tap4b620120-f0: left promiscuous mode
Nov 22 10:05:17 compute-0 nova_compute[186981]: 2025-11-22 10:05:17.004 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:17.006 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[20d7e02d-0a86-4aa3-b186-58179f63df3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:17.024 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[36b5931c-406a-4a57-8cd5-b65485bc2153]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:17.025 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[4420868b-44ee-41ab-a6fd-6aa8805cc4a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:17.042 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[521d49a3-2d95-47cb-a05d-10b86ceef25b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337301, 'reachable_time': 38038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214937, 'error': None, 'target': 'ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:17.044 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b620120-faec-4acd-a5db-2dd4d52d4de4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:05:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:17.044 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[6bac266d-5f02-4d3a-95fa-48d1fcf7c8a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:17 compute-0 systemd[1]: run-netns-ovnmeta\x2d4b620120\x2dfaec\x2d4acd\x2da5db\x2d2dd4d52d4de4.mount: Deactivated successfully.
Nov 22 10:05:17 compute-0 nova_compute[186981]: 2025-11-22 10:05:17.656 186985 DEBUG oslo_concurrency.lockutils [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:05:17 compute-0 nova_compute[186981]: 2025-11-22 10:05:17.656 186985 DEBUG oslo_concurrency.lockutils [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:05:17 compute-0 nova_compute[186981]: 2025-11-22 10:05:17.656 186985 DEBUG nova.network.neutron [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:05:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:17.933 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:17.934 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:17.935 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.107 186985 DEBUG nova.compute.manager [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-vif-plugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.107 186985 DEBUG oslo_concurrency.lockutils [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.108 186985 DEBUG oslo_concurrency.lockutils [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.108 186985 DEBUG oslo_concurrency.lockutils [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.108 186985 DEBUG nova.compute.manager [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] No waiting events found dispatching network-vif-plugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.108 186985 WARNING nova.compute.manager [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received unexpected event network-vif-plugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 for instance with vm_state active and task_state None.
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.108 186985 DEBUG nova.compute.manager [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-vif-unplugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.109 186985 DEBUG oslo_concurrency.lockutils [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.109 186985 DEBUG oslo_concurrency.lockutils [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.109 186985 DEBUG oslo_concurrency.lockutils [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.109 186985 DEBUG nova.compute.manager [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] No waiting events found dispatching network-vif-unplugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.109 186985 WARNING nova.compute.manager [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received unexpected event network-vif-unplugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 for instance with vm_state active and task_state None.
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.110 186985 DEBUG nova.compute.manager [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-vif-plugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.110 186985 DEBUG oslo_concurrency.lockutils [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.110 186985 DEBUG oslo_concurrency.lockutils [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.110 186985 DEBUG oslo_concurrency.lockutils [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.110 186985 DEBUG nova.compute.manager [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] No waiting events found dispatching network-vif-plugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.111 186985 WARNING nova.compute.manager [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received unexpected event network-vif-plugged-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 for instance with vm_state active and task_state None.
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.111 186985 DEBUG nova.compute.manager [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-vif-deleted-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.111 186985 INFO nova.compute.manager [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Neutron deleted interface d63b0b5c-fc47-476c-b88b-d0aaf0af46c1; detaching it from the instance and deleting it from the info cache
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.111 186985 DEBUG nova.network.neutron [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updating instance_info_cache with network_info: [{"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.161 186985 DEBUG nova.objects.instance [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lazy-loading 'system_metadata' on Instance uuid a1a0b78c-a821-4fde-b4a4-171ae9e144a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.229 186985 DEBUG nova.objects.instance [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lazy-loading 'flavor' on Instance uuid a1a0b78c-a821-4fde-b4a4-171ae9e144a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.464 186985 DEBUG nova.virt.libvirt.vif [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-730068357',display_name='tempest-TestNetworkBasicOps-server-730068357',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-730068357',id=3,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImQeMov6pORoa2jXsLOdfUIEvJ85N9QHuv8PP/kFK8MHgUxxXAFiFbT0/J6eT8MbccGFQWJlHCKkoeyt0nZaVr+hTNskWqKockEWoQ8p5e86JsemO0eYAbyhFt+MUVTcg==',key_name='tempest-TestNetworkBasicOps-1506447945',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:04:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-o7mt64q5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:04:46Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=a1a0b78c-a821-4fde-b4a4-171ae9e144a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.465 186985 DEBUG nova.network.os_vif_util [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Converting VIF {"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.465 186985 DEBUG nova.network.os_vif_util [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.469 186985 DEBUG nova.virt.libvirt.guest [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f6:7c:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd63b0b5c-fc"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.472 186985 DEBUG nova.virt.libvirt.guest [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f6:7c:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd63b0b5c-fc"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <name>instance-00000003</name>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <uuid>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</uuid>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-730068357</nova:name>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:05:16</nova:creationTime>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:port uuid="77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5">
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:05:18 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <memory unit='KiB'>131072</memory>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <vcpu placement='static'>1</vcpu>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <resource>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <partition>/machine</partition>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </resource>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <sysinfo type='smbios'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <system>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='manufacturer'>RDO</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='serial'>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='uuid'>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='family'>Virtual Machine</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </system>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <os>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <boot dev='hd'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <smbios mode='sysinfo'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </os>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <features>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <vmcoreinfo state='on'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </features>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <vendor>AMD</vendor>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='x2apic'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc-deadline'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='hypervisor'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc_adjust'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='spec-ctrl'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='stibp'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='ssbd'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='cmp_legacy'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='overflow-recov'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='succor'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='ibrs'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='amd-ssbd'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='virt-ssbd'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='lbrv'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='tsc-scale'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='vmcb-clean'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='flushbyasid'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='pause-filter'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='pfthreshold'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='xsaves'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='svm'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='topoext'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='npt'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='nrip-save'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <clock offset='utc'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <timer name='hpet' present='no'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <on_poweroff>destroy</on_poweroff>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <on_reboot>restart</on_reboot>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <on_crash>destroy</on_crash>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <disk type='file' device='disk'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk' index='2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <backingStore type='file' index='3'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:         <format type='raw'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:         <source file='/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:         <backingStore/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       </backingStore>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target dev='vda' bus='virtio'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='virtio-disk0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <disk type='file' device='cdrom'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk.config' index='1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <backingStore/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target dev='sda' bus='sata'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <readonly/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='sata0-0-0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pcie.0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='1' port='0x10'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='2' port='0x11'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='3' port='0x12'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.3'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='4' port='0x13'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.4'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='5' port='0x14'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.5'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='6' port='0x15'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.6'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='7' port='0x16'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.7'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='8' port='0x17'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.8'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='9' port='0x18'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.9'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='10' port='0x19'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.10'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='11' port='0x1a'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.11'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='12' port='0x1b'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.12'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='13' port='0x1c'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.13'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='14' port='0x1d'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.14'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='15' port='0x1e'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.15'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='16' port='0x1f'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.16'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='17' port='0x20'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.17'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='18' port='0x21'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.18'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='19' port='0x22'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.19'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='20' port='0x23'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.20'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='21' port='0x24'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.21'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='22' port='0x25'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.22'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='23' port='0x26'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.23'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='24' port='0x27'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.24'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='25' port='0x28'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.25'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-pci-bridge'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.26'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='usb'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='sata' index='0'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='ide'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <interface type='ethernet'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <mac address='fa:16:3e:9c:b8:16'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target dev='tap77bcca4c-75'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model type='virtio'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <mtu size='1442'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='net0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <serial type='pty'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/console.log' append='off'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target type='isa-serial' port='0'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:         <model name='isa-serial'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       </target>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/console.log' append='off'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target type='serial' port='0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </console>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <input type='tablet' bus='usb'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='input0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='usb' bus='0' port='1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <input type='mouse' bus='ps2'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='input1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <input type='keyboard' bus='ps2'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='input2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <listen type='address' address='::0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </graphics>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <audio id='1' type='none'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <video>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='video0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </video>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <watchdog model='itco' action='reset'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='watchdog0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </watchdog>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <memballoon model='virtio'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <stats period='10'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='balloon0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <rng model='virtio'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <backend model='random'>/dev/urandom</backend>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='rng0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <label>system_u:system_r:svirt_t:s0:c291,c756</label>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c291,c756</imagelabel>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <label>+107:+107</label>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <imagelabel>+107:+107</imagelabel>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:05:18 compute-0 nova_compute[186981]: </domain>
Nov 22 10:05:18 compute-0 nova_compute[186981]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.472 186985 DEBUG nova.virt.libvirt.guest [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f6:7c:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd63b0b5c-fc"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.476 186985 DEBUG nova.virt.libvirt.guest [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f6:7c:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd63b0b5c-fc"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <name>instance-00000003</name>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <uuid>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</uuid>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-730068357</nova:name>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:05:16</nova:creationTime>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:port uuid="77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5">
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:05:18 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <memory unit='KiB'>131072</memory>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <vcpu placement='static'>1</vcpu>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <resource>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <partition>/machine</partition>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </resource>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <sysinfo type='smbios'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <system>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='manufacturer'>RDO</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='serial'>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='uuid'>a1a0b78c-a821-4fde-b4a4-171ae9e144a9</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <entry name='family'>Virtual Machine</entry>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </system>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <os>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <boot dev='hd'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <smbios mode='sysinfo'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </os>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <features>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <vmcoreinfo state='on'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </features>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <vendor>AMD</vendor>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='x2apic'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc-deadline'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='hypervisor'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc_adjust'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='spec-ctrl'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='stibp'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='ssbd'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='cmp_legacy'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='overflow-recov'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='succor'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='ibrs'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='amd-ssbd'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='virt-ssbd'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='lbrv'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='tsc-scale'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='vmcb-clean'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='flushbyasid'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='pause-filter'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='pfthreshold'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='xsaves'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='svm'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='require' name='topoext'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='npt'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <feature policy='disable' name='nrip-save'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <clock offset='utc'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <timer name='hpet' present='no'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <on_poweroff>destroy</on_poweroff>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <on_reboot>restart</on_reboot>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <on_crash>destroy</on_crash>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <disk type='file' device='disk'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk' index='2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <backingStore type='file' index='3'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:         <format type='raw'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:         <source file='/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:         <backingStore/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       </backingStore>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target dev='vda' bus='virtio'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='virtio-disk0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <disk type='file' device='cdrom'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/disk.config' index='1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <backingStore/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target dev='sda' bus='sata'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <readonly/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='sata0-0-0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pcie.0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='1' port='0x10'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='2' port='0x11'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='3' port='0x12'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.3'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='4' port='0x13'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.4'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='5' port='0x14'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.5'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='6' port='0x15'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.6'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='7' port='0x16'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.7'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='8' port='0x17'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.8'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='9' port='0x18'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.9'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='10' port='0x19'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.10'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='11' port='0x1a'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.11'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='12' port='0x1b'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.12'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='13' port='0x1c'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.13'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='14' port='0x1d'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.14'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='15' port='0x1e'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.15'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='16' port='0x1f'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.16'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='17' port='0x20'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.17'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='18' port='0x21'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.18'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='19' port='0x22'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.19'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='20' port='0x23'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.20'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='21' port='0x24'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.21'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='22' port='0x25'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.22'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='23' port='0x26'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.23'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='24' port='0x27'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.24'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target chassis='25' port='0x28'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.25'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model name='pcie-pci-bridge'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='pci.26'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='usb'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <controller type='sata' index='0'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='ide'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <interface type='ethernet'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <mac address='fa:16:3e:9c:b8:16'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target dev='tap77bcca4c-75'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model type='virtio'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <mtu size='1442'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='net0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <serial type='pty'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/console.log' append='off'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target type='isa-serial' port='0'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:         <model name='isa-serial'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       </target>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9/console.log' append='off'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <target type='serial' port='0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </console>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <input type='tablet' bus='usb'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='input0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='usb' bus='0' port='1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <input type='mouse' bus='ps2'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='input1'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <input type='keyboard' bus='ps2'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='input2'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </input>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <listen type='address' address='::0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </graphics>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <audio id='1' type='none'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <video>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='video0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </video>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <watchdog model='itco' action='reset'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='watchdog0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </watchdog>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <memballoon model='virtio'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <stats period='10'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='balloon0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <rng model='virtio'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <backend model='random'>/dev/urandom</backend>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <alias name='rng0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <label>system_u:system_r:svirt_t:s0:c291,c756</label>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c291,c756</imagelabel>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <label>+107:+107</label>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <imagelabel>+107:+107</imagelabel>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:05:18 compute-0 nova_compute[186981]: </domain>
Nov 22 10:05:18 compute-0 nova_compute[186981]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.476 186985 WARNING nova.virt.libvirt.driver [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Detaching interface fa:16:3e:f6:7c:27 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapd63b0b5c-fc' not found.
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.477 186985 DEBUG nova.virt.libvirt.vif [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-730068357',display_name='tempest-TestNetworkBasicOps-server-730068357',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-730068357',id=3,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImQeMov6pORoa2jXsLOdfUIEvJ85N9QHuv8PP/kFK8MHgUxxXAFiFbT0/J6eT8MbccGFQWJlHCKkoeyt0nZaVr+hTNskWqKockEWoQ8p5e86JsemO0eYAbyhFt+MUVTcg==',key_name='tempest-TestNetworkBasicOps-1506447945',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:04:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-o7mt64q5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:04:46Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=a1a0b78c-a821-4fde-b4a4-171ae9e144a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.477 186985 DEBUG nova.network.os_vif_util [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Converting VIF {"id": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "address": "fa:16:3e:f6:7c:27", "network": {"id": "4b620120-faec-4acd-a5db-2dd4d52d4de4", "bridge": "br-int", "label": "tempest-network-smoke--1480446237", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd63b0b5c-fc", "ovs_interfaceid": "d63b0b5c-fc47-476c-b88b-d0aaf0af46c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.477 186985 DEBUG nova.network.os_vif_util [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.478 186985 DEBUG os_vif [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.479 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.479 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd63b0b5c-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.479 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.487 186985 INFO os_vif [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:7c:27,bridge_name='br-int',has_traffic_filtering=True,id=d63b0b5c-fc47-476c-b88b-d0aaf0af46c1,network=Network(4b620120-faec-4acd-a5db-2dd4d52d4de4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd63b0b5c-fc')
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.488 186985 DEBUG nova.virt.libvirt.guest [req-cc686b48-93e2-4015-a964-1566d68df5b8 req-ade076fe-224a-44b5-a767-9a7be2c9b7cc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-730068357</nova:name>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:05:18</nova:creationTime>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     <nova:port uuid="77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5">
Nov 22 10:05:18 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:05:18 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:05:18 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:05:18 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:05:18 compute-0 nova_compute[186981]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 22 10:05:18 compute-0 nova_compute[186981]: 2025-11-22 10:05:18.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:05:19 compute-0 ovn_controller[95329]: 2025-11-22T10:05:19Z|00068|binding|INFO|Releasing lport ff45d21c-62c7-4384-8069-e5576a843cd4 from this chassis (sb_readonly=0)
Nov 22 10:05:19 compute-0 nova_compute[186981]: 2025-11-22 10:05:19.219 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:19 compute-0 nova_compute[186981]: 2025-11-22 10:05:19.371 186985 INFO nova.network.neutron [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Port d63b0b5c-fc47-476c-b88b-d0aaf0af46c1 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 22 10:05:19 compute-0 nova_compute[186981]: 2025-11-22 10:05:19.372 186985 DEBUG nova.network.neutron [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updating instance_info_cache with network_info: [{"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:05:19 compute-0 nova_compute[186981]: 2025-11-22 10:05:19.436 186985 DEBUG oslo_concurrency.lockutils [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:05:19 compute-0 nova_compute[186981]: 2025-11-22 10:05:19.518 186985 DEBUG oslo_concurrency.lockutils [None req-ba237e66-7040-4996-b8e2-2441f6e9251f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "interface-a1a0b78c-a821-4fde-b4a4-171ae9e144a9-d63b0b5c-fc47-476c-b88b-d0aaf0af46c1" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.956s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.142 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.229 186985 DEBUG oslo_concurrency.lockutils [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.229 186985 DEBUG oslo_concurrency.lockutils [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.229 186985 DEBUG oslo_concurrency.lockutils [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.229 186985 DEBUG oslo_concurrency.lockutils [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.230 186985 DEBUG oslo_concurrency.lockutils [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.231 186985 INFO nova.compute.manager [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Terminating instance
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.232 186985 DEBUG nova.compute.manager [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.238 186985 DEBUG nova.compute.manager [req-691b26c6-0f8b-4ecb-8029-62bf0512c61e req-d9c3168f-d9e0-4462-b624-b6ee625d2f2b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-changed-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.238 186985 DEBUG nova.compute.manager [req-691b26c6-0f8b-4ecb-8029-62bf0512c61e req-d9c3168f-d9e0-4462-b624-b6ee625d2f2b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Refreshing instance network info cache due to event network-changed-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.238 186985 DEBUG oslo_concurrency.lockutils [req-691b26c6-0f8b-4ecb-8029-62bf0512c61e req-d9c3168f-d9e0-4462-b624-b6ee625d2f2b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.239 186985 DEBUG oslo_concurrency.lockutils [req-691b26c6-0f8b-4ecb-8029-62bf0512c61e req-d9c3168f-d9e0-4462-b624-b6ee625d2f2b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.239 186985 DEBUG nova.network.neutron [req-691b26c6-0f8b-4ecb-8029-62bf0512c61e req-d9c3168f-d9e0-4462-b624-b6ee625d2f2b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Refreshing network info cache for port 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:05:20 compute-0 kernel: tap77bcca4c-75 (unregistering): left promiscuous mode
Nov 22 10:05:20 compute-0 NetworkManager[55425]: <info>  [1763805920.2618] device (tap77bcca4c-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:05:20 compute-0 ovn_controller[95329]: 2025-11-22T10:05:20Z|00069|binding|INFO|Releasing lport 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 from this chassis (sb_readonly=0)
Nov 22 10:05:20 compute-0 ovn_controller[95329]: 2025-11-22T10:05:20Z|00070|binding|INFO|Setting lport 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 down in Southbound
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.268 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:20 compute-0 ovn_controller[95329]: 2025-11-22T10:05:20Z|00071|binding|INFO|Removing iface tap77bcca4c-75 ovn-installed in OVS
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.270 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.288 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:20 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Nov 22 10:05:20 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 14.626s CPU time.
Nov 22 10:05:20 compute-0 systemd-machined[153303]: Machine qemu-3-instance-00000003 terminated.
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.303 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:b8:16 10.100.0.7'], port_security=['fa:16:3e:9c:b8:16 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a1a0b78c-a821-4fde-b4a4-171ae9e144a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a9237fd-d977-4a70-8d56-fb4443b7d2d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ebf61ea0-f815-490b-895f-16ea19134e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e5bbf08-60f5-43e7-9a4d-45e4f8bd58b8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.305 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 in datapath 3a9237fd-d977-4a70-8d56-fb4443b7d2d4 unbound from our chassis
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.306 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a9237fd-d977-4a70-8d56-fb4443b7d2d4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.309 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[39c367fc-cf16-475f-93e4-6462989c84b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.310 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4 namespace which is not needed anymore
Nov 22 10:05:20 compute-0 podman[214938]: 2025-11-22 10:05:20.350340525 +0000 UTC m=+0.067455195 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 10:05:20 compute-0 neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4[214579]: [NOTICE]   (214583) : haproxy version is 2.8.14-c23fe91
Nov 22 10:05:20 compute-0 neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4[214579]: [NOTICE]   (214583) : path to executable is /usr/sbin/haproxy
Nov 22 10:05:20 compute-0 neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4[214579]: [WARNING]  (214583) : Exiting Master process...
Nov 22 10:05:20 compute-0 neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4[214579]: [ALERT]    (214583) : Current worker (214585) exited with code 143 (Terminated)
Nov 22 10:05:20 compute-0 neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4[214579]: [WARNING]  (214583) : All workers exited. Exiting... (0)
Nov 22 10:05:20 compute-0 systemd[1]: libpod-e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247.scope: Deactivated successfully.
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.454 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:20 compute-0 conmon[214579]: conmon e3c619387f59b9790142 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247.scope/container/memory.events
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.460 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:20 compute-0 podman[214987]: 2025-11-22 10:05:20.462791207 +0000 UTC m=+0.056233487 container died e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.493 186985 INFO nova.virt.libvirt.driver [-] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Instance destroyed successfully.
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.494 186985 DEBUG nova.objects.instance [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid a1a0b78c-a821-4fde-b4a4-171ae9e144a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:05:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-9cd5c3051c976381b4edd28ff9fa7f6aae4d6c6f3978819b2fb4954cedd513c1-merged.mount: Deactivated successfully.
Nov 22 10:05:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247-userdata-shm.mount: Deactivated successfully.
Nov 22 10:05:20 compute-0 podman[214987]: 2025-11-22 10:05:20.501205542 +0000 UTC m=+0.094647752 container cleanup e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 10:05:20 compute-0 systemd[1]: libpod-conmon-e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247.scope: Deactivated successfully.
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.539 186985 DEBUG nova.virt.libvirt.vif [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:04:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-730068357',display_name='tempest-TestNetworkBasicOps-server-730068357',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-730068357',id=3,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBImQeMov6pORoa2jXsLOdfUIEvJ85N9QHuv8PP/kFK8MHgUxxXAFiFbT0/J6eT8MbccGFQWJlHCKkoeyt0nZaVr+hTNskWqKockEWoQ8p5e86JsemO0eYAbyhFt+MUVTcg==',key_name='tempest-TestNetworkBasicOps-1506447945',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:04:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-o7mt64q5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:04:46Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=a1a0b78c-a821-4fde-b4a4-171ae9e144a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.539 186985 DEBUG nova.network.os_vif_util [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.540 186985 DEBUG nova.network.os_vif_util [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9c:b8:16,bridge_name='br-int',has_traffic_filtering=True,id=77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5,network=Network(3a9237fd-d977-4a70-8d56-fb4443b7d2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77bcca4c-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.541 186985 DEBUG os_vif [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9c:b8:16,bridge_name='br-int',has_traffic_filtering=True,id=77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5,network=Network(3a9237fd-d977-4a70-8d56-fb4443b7d2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77bcca4c-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.542 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.543 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77bcca4c-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.545 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.545 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.548 186985 INFO os_vif [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9c:b8:16,bridge_name='br-int',has_traffic_filtering=True,id=77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5,network=Network(3a9237fd-d977-4a70-8d56-fb4443b7d2d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77bcca4c-75')
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.548 186985 INFO nova.virt.libvirt.driver [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Deleting instance files /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9_del
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.549 186985 INFO nova.virt.libvirt.driver [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Deletion of /var/lib/nova/instances/a1a0b78c-a821-4fde-b4a4-171ae9e144a9_del complete
Nov 22 10:05:20 compute-0 podman[215032]: 2025-11-22 10:05:20.571015922 +0000 UTC m=+0.046024546 container remove e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.576 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[4ef9ab00-89e2-4b40-b5ee-a764b44897cd]: (4, ('Sat Nov 22 10:05:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4 (e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247)\ne3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247\nSat Nov 22 10:05:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4 (e3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247)\ne3c619387f59b9790142342a940e7ff35e1dbf2a9b37b889707604e226627247\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.577 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[bc26a9b4-3b25-424c-b4ff-22f3d0b9b56b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.578 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a9237fd-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.579 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:20 compute-0 kernel: tap3a9237fd-d0: left promiscuous mode
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.595 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.597 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[cd2a15eb-7e3b-4aae-9343-df4a33bd2bea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.620 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[74e6e44c-df0d-4bff-a383-9f75b567bda9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.622 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[030f3684-5bd3-46e8-ba20-ebc88b375b2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.635 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[433de227-184c-4c9a-ba2b-e3dc19453320]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334264, 'reachable_time': 19997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215047, 'error': None, 'target': 'ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.636 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3a9237fd-d977-4a70-8d56-fb4443b7d2d4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:05:20 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:20.637 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[943ca6c9-6dfc-4d20-8d62-244b7ebfcbc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:20 compute-0 systemd[1]: run-netns-ovnmeta\x2d3a9237fd\x2dd977\x2d4a70\x2d8d56\x2dfb4443b7d2d4.mount: Deactivated successfully.
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.658 186985 INFO nova.compute.manager [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Took 0.43 seconds to destroy the instance on the hypervisor.
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.659 186985 DEBUG oslo.service.loopingcall [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.660 186985 DEBUG nova.compute.manager [-] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.660 186985 DEBUG nova.network.neutron [-] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.915 186985 DEBUG nova.compute.manager [req-eff45199-dcf9-4c3d-945f-fb23ae7a2c44 req-cdd8b0d8-ab3c-4044-ab61-c7a78c2748e4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-vif-unplugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.916 186985 DEBUG oslo_concurrency.lockutils [req-eff45199-dcf9-4c3d-945f-fb23ae7a2c44 req-cdd8b0d8-ab3c-4044-ab61-c7a78c2748e4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.917 186985 DEBUG oslo_concurrency.lockutils [req-eff45199-dcf9-4c3d-945f-fb23ae7a2c44 req-cdd8b0d8-ab3c-4044-ab61-c7a78c2748e4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.917 186985 DEBUG oslo_concurrency.lockutils [req-eff45199-dcf9-4c3d-945f-fb23ae7a2c44 req-cdd8b0d8-ab3c-4044-ab61-c7a78c2748e4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.917 186985 DEBUG nova.compute.manager [req-eff45199-dcf9-4c3d-945f-fb23ae7a2c44 req-cdd8b0d8-ab3c-4044-ab61-c7a78c2748e4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] No waiting events found dispatching network-vif-unplugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:05:20 compute-0 nova_compute[186981]: 2025-11-22 10:05:20.917 186985 DEBUG nova.compute.manager [req-eff45199-dcf9-4c3d-945f-fb23ae7a2c44 req-cdd8b0d8-ab3c-4044-ab61-c7a78c2748e4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-vif-unplugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.383 186985 DEBUG nova.network.neutron [-] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.403 186985 INFO nova.compute.manager [-] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Took 0.74 seconds to deallocate network for instance.
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.456 186985 DEBUG oslo_concurrency.lockutils [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.456 186985 DEBUG oslo_concurrency.lockutils [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.513 186985 DEBUG nova.compute.provider_tree [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.525 186985 DEBUG nova.scheduler.client.report [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.545 186985 DEBUG oslo_concurrency.lockutils [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.567 186985 INFO nova.scheduler.client.report [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance a1a0b78c-a821-4fde-b4a4-171ae9e144a9
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.601 186985 DEBUG nova.network.neutron [req-691b26c6-0f8b-4ecb-8029-62bf0512c61e req-d9c3168f-d9e0-4462-b624-b6ee625d2f2b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updated VIF entry in instance network info cache for port 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.602 186985 DEBUG nova.network.neutron [req-691b26c6-0f8b-4ecb-8029-62bf0512c61e req-d9c3168f-d9e0-4462-b624-b6ee625d2f2b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Updating instance_info_cache with network_info: [{"id": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "address": "fa:16:3e:9c:b8:16", "network": {"id": "3a9237fd-d977-4a70-8d56-fb4443b7d2d4", "bridge": "br-int", "label": "tempest-network-smoke--1551150872", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77bcca4c-75", "ovs_interfaceid": "77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.639 186985 DEBUG oslo_concurrency.lockutils [None req-c3289424-3e16-4f2a-b13e-bc6dc1b1d4af fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:21 compute-0 nova_compute[186981]: 2025-11-22 10:05:21.640 186985 DEBUG oslo_concurrency.lockutils [req-691b26c6-0f8b-4ecb-8029-62bf0512c61e req-d9c3168f-d9e0-4462-b624-b6ee625d2f2b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-a1a0b78c-a821-4fde-b4a4-171ae9e144a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:05:22 compute-0 nova_compute[186981]: 2025-11-22 10:05:22.375 186985 DEBUG nova.compute.manager [req-5e36cda6-5c62-4043-bff2-4b7280504658 req-17474edd-d9db-48c0-a4b0-033853a1658e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-vif-deleted-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:22 compute-0 nova_compute[186981]: 2025-11-22 10:05:22.376 186985 INFO nova.compute.manager [req-5e36cda6-5c62-4043-bff2-4b7280504658 req-17474edd-d9db-48c0-a4b0-033853a1658e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Neutron deleted interface 77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5; detaching it from the instance and deleting it from the info cache
Nov 22 10:05:22 compute-0 nova_compute[186981]: 2025-11-22 10:05:22.377 186985 DEBUG nova.network.neutron [req-5e36cda6-5c62-4043-bff2-4b7280504658 req-17474edd-d9db-48c0-a4b0-033853a1658e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Nov 22 10:05:22 compute-0 nova_compute[186981]: 2025-11-22 10:05:22.380 186985 DEBUG nova.compute.manager [req-5e36cda6-5c62-4043-bff2-4b7280504658 req-17474edd-d9db-48c0-a4b0-033853a1658e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Detach interface failed, port_id=77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5, reason: Instance a1a0b78c-a821-4fde-b4a4-171ae9e144a9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 10:05:23 compute-0 nova_compute[186981]: 2025-11-22 10:05:23.010 186985 DEBUG nova.compute.manager [req-2e829034-d252-445f-9bce-24d8ea9acebc req-299d93f8-eb3b-428b-a069-0090df38d226 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received event network-vif-plugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:23 compute-0 nova_compute[186981]: 2025-11-22 10:05:23.011 186985 DEBUG oslo_concurrency.lockutils [req-2e829034-d252-445f-9bce-24d8ea9acebc req-299d93f8-eb3b-428b-a069-0090df38d226 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:23 compute-0 nova_compute[186981]: 2025-11-22 10:05:23.011 186985 DEBUG oslo_concurrency.lockutils [req-2e829034-d252-445f-9bce-24d8ea9acebc req-299d93f8-eb3b-428b-a069-0090df38d226 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:23 compute-0 nova_compute[186981]: 2025-11-22 10:05:23.012 186985 DEBUG oslo_concurrency.lockutils [req-2e829034-d252-445f-9bce-24d8ea9acebc req-299d93f8-eb3b-428b-a069-0090df38d226 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "a1a0b78c-a821-4fde-b4a4-171ae9e144a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:23 compute-0 nova_compute[186981]: 2025-11-22 10:05:23.012 186985 DEBUG nova.compute.manager [req-2e829034-d252-445f-9bce-24d8ea9acebc req-299d93f8-eb3b-428b-a069-0090df38d226 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] No waiting events found dispatching network-vif-plugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:05:23 compute-0 nova_compute[186981]: 2025-11-22 10:05:23.013 186985 WARNING nova.compute.manager [req-2e829034-d252-445f-9bce-24d8ea9acebc req-299d93f8-eb3b-428b-a069-0090df38d226 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Received unexpected event network-vif-plugged-77bcca4c-75ff-43d3-9e1a-cb9b5483b6a5 for instance with vm_state deleted and task_state None.
Nov 22 10:05:25 compute-0 nova_compute[186981]: 2025-11-22 10:05:25.170 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:25 compute-0 nova_compute[186981]: 2025-11-22 10:05:25.545 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:28 compute-0 podman[215048]: 2025-11-22 10:05:28.638367247 +0000 UTC m=+0.083676782 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 10:05:28 compute-0 nova_compute[186981]: 2025-11-22 10:05:28.707 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:28 compute-0 podman[215049]: 2025-11-22 10:05:28.711486187 +0000 UTC m=+0.154115028 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 10:05:28 compute-0 nova_compute[186981]: 2025-11-22 10:05:28.789 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:30 compute-0 nova_compute[186981]: 2025-11-22 10:05:30.201 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:30 compute-0 nova_compute[186981]: 2025-11-22 10:05:30.546 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:33 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:33.803 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:05:33 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:33.803 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:05:33 compute-0 nova_compute[186981]: 2025-11-22 10:05:33.838 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:34 compute-0 podman[215097]: 2025-11-22 10:05:34.599272412 +0000 UTC m=+0.050293774 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 10:05:34 compute-0 podman[215098]: 2025-11-22 10:05:34.642299475 +0000 UTC m=+0.085916893 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41)
Nov 22 10:05:35 compute-0 nova_compute[186981]: 2025-11-22 10:05:35.203 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:35 compute-0 nova_compute[186981]: 2025-11-22 10:05:35.492 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763805920.4904935, a1a0b78c-a821-4fde-b4a4-171ae9e144a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:05:35 compute-0 nova_compute[186981]: 2025-11-22 10:05:35.493 186985 INFO nova.compute.manager [-] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] VM Stopped (Lifecycle Event)
Nov 22 10:05:35 compute-0 nova_compute[186981]: 2025-11-22 10:05:35.511 186985 DEBUG nova.compute.manager [None req-5878f873-32f3-45c2-9f19-2fed3c22b1bf - - - - - -] [instance: a1a0b78c-a821-4fde-b4a4-171ae9e144a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:05:35 compute-0 nova_compute[186981]: 2025-11-22 10:05:35.547 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:38 compute-0 podman[215137]: 2025-11-22 10:05:38.598544877 +0000 UTC m=+0.054359856 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:05:38 compute-0 podman[215138]: 2025-11-22 10:05:38.651479821 +0000 UTC m=+0.091476905 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 10:05:40 compute-0 nova_compute[186981]: 2025-11-22 10:05:40.207 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:40 compute-0 nova_compute[186981]: 2025-11-22 10:05:40.549 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:42 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:42.806 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:43 compute-0 nova_compute[186981]: 2025-11-22 10:05:43.980 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:43 compute-0 nova_compute[186981]: 2025-11-22 10:05:43.981 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:43 compute-0 nova_compute[186981]: 2025-11-22 10:05:43.994 186985 DEBUG nova.compute.manager [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.079 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.080 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.092 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.092 186985 INFO nova.compute.claims [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.252 186985 DEBUG nova.compute.provider_tree [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.271 186985 DEBUG nova.scheduler.client.report [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.292 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.293 186985 DEBUG nova.compute.manager [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.359 186985 DEBUG nova.compute.manager [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.360 186985 DEBUG nova.network.neutron [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.386 186985 INFO nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.415 186985 DEBUG nova.compute.manager [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.514 186985 DEBUG nova.compute.manager [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.516 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.517 186985 INFO nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Creating image(s)
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.518 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.519 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.520 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.543 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.624 186985 DEBUG nova.policy [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.637 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.638 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.639 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.662 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.746 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.747 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.821 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk 1073741824" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.823 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.823 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.907 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.908 186985 DEBUG nova.virt.disk.api [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.909 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.973 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.975 186985 DEBUG nova.virt.disk.api [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.976 186985 DEBUG nova.objects.instance [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 31123a76-87ae-4a5e-adb5-94bb94b3bc6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.993 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.993 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Ensure instance console log exists: /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.994 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.995 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:44 compute-0 nova_compute[186981]: 2025-11-22 10:05:44.995 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:45 compute-0 nova_compute[186981]: 2025-11-22 10:05:45.211 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:45 compute-0 nova_compute[186981]: 2025-11-22 10:05:45.312 186985 DEBUG nova.network.neutron [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Successfully created port: b080d9aa-2ce8-4a11-9f13-796159a6e632 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:05:45 compute-0 nova_compute[186981]: 2025-11-22 10:05:45.551 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:46 compute-0 nova_compute[186981]: 2025-11-22 10:05:46.590 186985 DEBUG nova.network.neutron [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Successfully updated port: b080d9aa-2ce8-4a11-9f13-796159a6e632 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:05:46 compute-0 nova_compute[186981]: 2025-11-22 10:05:46.605 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:05:46 compute-0 nova_compute[186981]: 2025-11-22 10:05:46.606 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:05:46 compute-0 nova_compute[186981]: 2025-11-22 10:05:46.606 186985 DEBUG nova.network.neutron [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:05:46 compute-0 nova_compute[186981]: 2025-11-22 10:05:46.678 186985 DEBUG nova.compute.manager [req-186c66f6-ba94-4cd7-b811-17124af9dad5 req-8263637c-0efa-4306-a6e3-2bc53fbe7604 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Received event network-changed-b080d9aa-2ce8-4a11-9f13-796159a6e632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:46 compute-0 nova_compute[186981]: 2025-11-22 10:05:46.679 186985 DEBUG nova.compute.manager [req-186c66f6-ba94-4cd7-b811-17124af9dad5 req-8263637c-0efa-4306-a6e3-2bc53fbe7604 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Refreshing instance network info cache due to event network-changed-b080d9aa-2ce8-4a11-9f13-796159a6e632. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:05:46 compute-0 nova_compute[186981]: 2025-11-22 10:05:46.679 186985 DEBUG oslo_concurrency.lockutils [req-186c66f6-ba94-4cd7-b811-17124af9dad5 req-8263637c-0efa-4306-a6e3-2bc53fbe7604 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:05:47 compute-0 nova_compute[186981]: 2025-11-22 10:05:47.584 186985 DEBUG nova.network.neutron [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.614 186985 DEBUG nova.network.neutron [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Updating instance_info_cache with network_info: [{"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.640 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.640 186985 DEBUG nova.compute.manager [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Instance network_info: |[{"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.641 186985 DEBUG oslo_concurrency.lockutils [req-186c66f6-ba94-4cd7-b811-17124af9dad5 req-8263637c-0efa-4306-a6e3-2bc53fbe7604 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.641 186985 DEBUG nova.network.neutron [req-186c66f6-ba94-4cd7-b811-17124af9dad5 req-8263637c-0efa-4306-a6e3-2bc53fbe7604 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Refreshing network info cache for port b080d9aa-2ce8-4a11-9f13-796159a6e632 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.643 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Start _get_guest_xml network_info=[{"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.648 186985 WARNING nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.653 186985 DEBUG nova.virt.libvirt.host [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.654 186985 DEBUG nova.virt.libvirt.host [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.657 186985 DEBUG nova.virt.libvirt.host [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.657 186985 DEBUG nova.virt.libvirt.host [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.658 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.658 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.658 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.658 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.659 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.659 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.659 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.659 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.660 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.660 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.660 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.660 186985 DEBUG nova.virt.hardware [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.663 186985 DEBUG nova.virt.libvirt.vif [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1697286732',display_name='tempest-TestNetworkBasicOps-server-1697286732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1697286732',id=4,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO/TraUlcHr06zIR1/fy65BadMzxpUUGQrSin66VoKaXPnvQ1h05XrAHMaJIBha2hYo4NDBtQKvRkImpRpFOYS7fh90OolTkra8lDl3ROArQjfbVVcAzy9O1QGUoVCoevQ==',key_name='tempest-TestNetworkBasicOps-1593716210',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-mynju97c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:05:44Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=31123a76-87ae-4a5e-adb5-94bb94b3bc6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.663 186985 DEBUG nova.network.os_vif_util [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.664 186985 DEBUG nova.network.os_vif_util [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:3c:fe,bridge_name='br-int',has_traffic_filtering=True,id=b080d9aa-2ce8-4a11-9f13-796159a6e632,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb080d9aa-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.665 186985 DEBUG nova.objects.instance [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 31123a76-87ae-4a5e-adb5-94bb94b3bc6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.684 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:05:48 compute-0 nova_compute[186981]:   <uuid>31123a76-87ae-4a5e-adb5-94bb94b3bc6f</uuid>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   <name>instance-00000004</name>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-1697286732</nova:name>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:05:48</nova:creationTime>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:05:48 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:05:48 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:05:48 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:05:48 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:05:48 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:05:48 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:05:48 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:05:48 compute-0 nova_compute[186981]:         <nova:port uuid="b080d9aa-2ce8-4a11-9f13-796159a6e632">
Nov 22 10:05:48 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <system>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <entry name="serial">31123a76-87ae-4a5e-adb5-94bb94b3bc6f</entry>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <entry name="uuid">31123a76-87ae-4a5e-adb5-94bb94b3bc6f</entry>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     </system>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   <os>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   </os>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   <features>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   </features>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.config"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:d4:3c:fe"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <target dev="tapb080d9aa-2c"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/console.log" append="off"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <video>
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     </video>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:05:48 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:05:48 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:05:48 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:05:48 compute-0 nova_compute[186981]: </domain>
Nov 22 10:05:48 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.685 186985 DEBUG nova.compute.manager [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Preparing to wait for external event network-vif-plugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.685 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.686 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.686 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.686 186985 DEBUG nova.virt.libvirt.vif [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1697286732',display_name='tempest-TestNetworkBasicOps-server-1697286732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1697286732',id=4,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO/TraUlcHr06zIR1/fy65BadMzxpUUGQrSin66VoKaXPnvQ1h05XrAHMaJIBha2hYo4NDBtQKvRkImpRpFOYS7fh90OolTkra8lDl3ROArQjfbVVcAzy9O1QGUoVCoevQ==',key_name='tempest-TestNetworkBasicOps-1593716210',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-mynju97c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:05:44Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=31123a76-87ae-4a5e-adb5-94bb94b3bc6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.687 186985 DEBUG nova.network.os_vif_util [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.687 186985 DEBUG nova.network.os_vif_util [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:3c:fe,bridge_name='br-int',has_traffic_filtering=True,id=b080d9aa-2ce8-4a11-9f13-796159a6e632,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb080d9aa-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.687 186985 DEBUG os_vif [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:3c:fe,bridge_name='br-int',has_traffic_filtering=True,id=b080d9aa-2ce8-4a11-9f13-796159a6e632,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb080d9aa-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.688 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.688 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.689 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.691 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.691 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb080d9aa-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.691 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb080d9aa-2c, col_values=(('external_ids', {'iface-id': 'b080d9aa-2ce8-4a11-9f13-796159a6e632', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:3c:fe', 'vm-uuid': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.754 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:48 compute-0 NetworkManager[55425]: <info>  [1763805948.7553] manager: (tapb080d9aa-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.756 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.759 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.760 186985 INFO os_vif [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:3c:fe,bridge_name='br-int',has_traffic_filtering=True,id=b080d9aa-2ce8-4a11-9f13-796159a6e632,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb080d9aa-2c')
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.869 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.869 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.869 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:d4:3c:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:05:48 compute-0 nova_compute[186981]: 2025-11-22 10:05:48.870 186985 INFO nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Using config drive
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.457 186985 INFO nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Creating config drive at /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.config
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.466 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmmg0o0gh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.607 186985 DEBUG oslo_concurrency.processutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmmg0o0gh" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:05:49 compute-0 kernel: tapb080d9aa-2c: entered promiscuous mode
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.676 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:49 compute-0 ovn_controller[95329]: 2025-11-22T10:05:49Z|00072|binding|INFO|Claiming lport b080d9aa-2ce8-4a11-9f13-796159a6e632 for this chassis.
Nov 22 10:05:49 compute-0 ovn_controller[95329]: 2025-11-22T10:05:49Z|00073|binding|INFO|b080d9aa-2ce8-4a11-9f13-796159a6e632: Claiming fa:16:3e:d4:3c:fe 10.100.0.8
Nov 22 10:05:49 compute-0 NetworkManager[55425]: <info>  [1763805949.6784] manager: (tapb080d9aa-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.680 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.683 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.686 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.712 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:3c:fe 10.100.0.8'], port_security=['fa:16:3e:d4:3c:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1f4b7f64-d796-4713-9fe1-bbaf401238e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eaf7449-26c5-4408-9e7a-284a6f6737da, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=b080d9aa-2ce8-4a11-9f13-796159a6e632) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.713 104216 INFO neutron.agent.ovn.metadata.agent [-] Port b080d9aa-2ce8-4a11-9f13-796159a6e632 in datapath f23eafbf-d3c9-4bb3-9fb9-34bdd735a136 bound to our chassis
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.714 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f23eafbf-d3c9-4bb3-9fb9-34bdd735a136
Nov 22 10:05:49 compute-0 systemd-udevd[215214]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:05:49 compute-0 systemd-machined[153303]: New machine qemu-4-instance-00000004.
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.725 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[f19c43b3-e855-435c-9528-b21eb4e1af10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.725 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf23eafbf-d1 in ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.727 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf23eafbf-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.727 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[f52f5223-c018-499b-85d1-0db417a06404]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.728 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[a7273e80-81d1-4fe0-bc49-fd03dcafd89d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 NetworkManager[55425]: <info>  [1763805949.7349] device (tapb080d9aa-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:05:49 compute-0 NetworkManager[55425]: <info>  [1763805949.7363] device (tapb080d9aa-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.744 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d7fa72-1032-4b47-86c9-752271936f60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.773 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[6756aa51-c11a-4207-97ec-c6fafcb86dc3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 ovn_controller[95329]: 2025-11-22T10:05:49Z|00074|binding|INFO|Setting lport b080d9aa-2ce8-4a11-9f13-796159a6e632 ovn-installed in OVS
Nov 22 10:05:49 compute-0 ovn_controller[95329]: 2025-11-22T10:05:49Z|00075|binding|INFO|Setting lport b080d9aa-2ce8-4a11-9f13-796159a6e632 up in Southbound
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.777 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:49 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.801 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[fa6d6b87-9777-43af-8a41-efdd9c74d8c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 NetworkManager[55425]: <info>  [1763805949.8079] manager: (tapf23eafbf-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/44)
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.807 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[2ad2118a-e26a-405a-9717-e8873dae2155]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.835 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[7f218569-0582-4645-9827-ac613d8034a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.839 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[e19dd361-56e1-4b36-86e9-f9fc2b92ca59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 NetworkManager[55425]: <info>  [1763805949.8610] device (tapf23eafbf-d0): carrier: link connected
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.866 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[c3813cc5-7493-4784-9a0b-bfbede82faa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.883 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[e77d9df3-c0ae-4577-9c5b-0f698546da44]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf23eafbf-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:86:f6:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340756, 'reachable_time': 26942, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215247, 'error': None, 'target': 'ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.896 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[1f247587-15eb-4bc8-8c86-664576584153]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe86:f6e7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 340756, 'tstamp': 340756}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215248, 'error': None, 'target': 'ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.910 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7ccf23-987e-422c-93a3-80c55eb16bb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf23eafbf-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:86:f6:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340756, 'reachable_time': 26942, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215249, 'error': None, 'target': 'ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:49.945 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[53d06817-6482-423a-9bd7-7df53dc34d2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.971 186985 DEBUG nova.compute.manager [req-1ebf1d03-99bb-45fb-9379-10e92e99321e req-45dcfcc3-7ea8-4c4a-ae34-97a2edeef69a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Received event network-vif-plugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.973 186985 DEBUG oslo_concurrency.lockutils [req-1ebf1d03-99bb-45fb-9379-10e92e99321e req-45dcfcc3-7ea8-4c4a-ae34-97a2edeef69a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.974 186985 DEBUG oslo_concurrency.lockutils [req-1ebf1d03-99bb-45fb-9379-10e92e99321e req-45dcfcc3-7ea8-4c4a-ae34-97a2edeef69a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.975 186985 DEBUG oslo_concurrency.lockutils [req-1ebf1d03-99bb-45fb-9379-10e92e99321e req-45dcfcc3-7ea8-4c4a-ae34-97a2edeef69a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:49 compute-0 nova_compute[186981]: 2025-11-22 10:05:49.976 186985 DEBUG nova.compute.manager [req-1ebf1d03-99bb-45fb-9379-10e92e99321e req-45dcfcc3-7ea8-4c4a-ae34-97a2edeef69a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Processing event network-vif-plugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:50.021 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[62b1a704-1b94-4098-a712-2a4eab2e90eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:50.023 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf23eafbf-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:50.023 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:50.024 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf23eafbf-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.070 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:50 compute-0 kernel: tapf23eafbf-d0: entered promiscuous mode
Nov 22 10:05:50 compute-0 NetworkManager[55425]: <info>  [1763805950.0726] manager: (tapf23eafbf-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.074 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:50.077 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf23eafbf-d0, col_values=(('external_ids', {'iface-id': 'c7e96b5d-1547-4265-a12b-bb708976c4c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.079 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:50 compute-0 ovn_controller[95329]: 2025-11-22T10:05:50Z|00076|binding|INFO|Releasing lport c7e96b5d-1547-4265-a12b-bb708976c4c0 from this chassis (sb_readonly=0)
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.081 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:50.083 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f23eafbf-d3c9-4bb3-9fb9-34bdd735a136.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f23eafbf-d3c9-4bb3-9fb9-34bdd735a136.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:50.084 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[dc9fcbf8-df56-49ef-adfe-b1f00ce396ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:50.085 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/f23eafbf-d3c9-4bb3-9fb9-34bdd735a136.pid.haproxy
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID f23eafbf-d3c9-4bb3-9fb9-34bdd735a136
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:05:50 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:05:50.086 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'env', 'PROCESS_TAG=haproxy-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f23eafbf-d3c9-4bb3-9fb9-34bdd735a136.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.092 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.214 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.345 186985 DEBUG nova.compute.manager [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.346 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805950.3442202, 31123a76-87ae-4a5e-adb5-94bb94b3bc6f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.347 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] VM Started (Lifecycle Event)
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.352 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.357 186985 INFO nova.virt.libvirt.driver [-] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Instance spawned successfully.
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.357 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.381 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.389 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.397 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.398 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.399 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.400 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.401 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.402 186985 DEBUG nova.virt.libvirt.driver [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.408 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.410 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805950.3458776, 31123a76-87ae-4a5e-adb5-94bb94b3bc6f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.410 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] VM Paused (Lifecycle Event)
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.433 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.437 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805950.3507974, 31123a76-87ae-4a5e-adb5-94bb94b3bc6f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.437 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] VM Resumed (Lifecycle Event)
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.456 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.461 186985 INFO nova.compute.manager [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Took 5.95 seconds to spawn the instance on the hypervisor.
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.461 186985 DEBUG nova.compute.manager [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.463 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.495 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.528 186985 INFO nova.compute.manager [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Took 6.49 seconds to build instance.
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.553 186985 DEBUG oslo_concurrency.lockutils [None req-eafc1018-892e-4175-8067-dae797f718fd fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:50 compute-0 podman[215288]: 2025-11-22 10:05:50.505061963 +0000 UTC m=+0.031642551 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:05:50 compute-0 podman[215288]: 2025-11-22 10:05:50.877550564 +0000 UTC m=+0.404131132 container create f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.921 186985 DEBUG nova.network.neutron [req-186c66f6-ba94-4cd7-b811-17124af9dad5 req-8263637c-0efa-4306-a6e3-2bc53fbe7604 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Updated VIF entry in instance network info cache for port b080d9aa-2ce8-4a11-9f13-796159a6e632. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.923 186985 DEBUG nova.network.neutron [req-186c66f6-ba94-4cd7-b811-17124af9dad5 req-8263637c-0efa-4306-a6e3-2bc53fbe7604 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Updating instance_info_cache with network_info: [{"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:05:50 compute-0 nova_compute[186981]: 2025-11-22 10:05:50.936 186985 DEBUG oslo_concurrency.lockutils [req-186c66f6-ba94-4cd7-b811-17124af9dad5 req-8263637c-0efa-4306-a6e3-2bc53fbe7604 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:05:50 compute-0 podman[215300]: 2025-11-22 10:05:50.99126087 +0000 UTC m=+0.444127991 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 10:05:51 compute-0 systemd[1]: Started libpod-conmon-f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850.scope.
Nov 22 10:05:51 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:05:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/525336be04cadd96ba03b1cd31428e5dfb8d6be567d17ee9e434c1f741230a4c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:05:51 compute-0 podman[215288]: 2025-11-22 10:05:51.157744477 +0000 UTC m=+0.684325125 container init f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 10:05:51 compute-0 podman[215288]: 2025-11-22 10:05:51.169248024 +0000 UTC m=+0.695828622 container start f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:05:51 compute-0 neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136[215326]: [NOTICE]   (215330) : New worker (215332) forked
Nov 22 10:05:51 compute-0 neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136[215326]: [NOTICE]   (215330) : Loading success.
Nov 22 10:05:52 compute-0 nova_compute[186981]: 2025-11-22 10:05:52.058 186985 DEBUG nova.compute.manager [req-afaef58b-89af-4f07-9779-23b0f345cca5 req-a7e22f13-ac9a-450c-8d7e-c6103346d599 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Received event network-vif-plugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:52 compute-0 nova_compute[186981]: 2025-11-22 10:05:52.059 186985 DEBUG oslo_concurrency.lockutils [req-afaef58b-89af-4f07-9779-23b0f345cca5 req-a7e22f13-ac9a-450c-8d7e-c6103346d599 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:05:52 compute-0 nova_compute[186981]: 2025-11-22 10:05:52.059 186985 DEBUG oslo_concurrency.lockutils [req-afaef58b-89af-4f07-9779-23b0f345cca5 req-a7e22f13-ac9a-450c-8d7e-c6103346d599 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:05:52 compute-0 nova_compute[186981]: 2025-11-22 10:05:52.059 186985 DEBUG oslo_concurrency.lockutils [req-afaef58b-89af-4f07-9779-23b0f345cca5 req-a7e22f13-ac9a-450c-8d7e-c6103346d599 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:05:52 compute-0 nova_compute[186981]: 2025-11-22 10:05:52.059 186985 DEBUG nova.compute.manager [req-afaef58b-89af-4f07-9779-23b0f345cca5 req-a7e22f13-ac9a-450c-8d7e-c6103346d599 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] No waiting events found dispatching network-vif-plugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:05:52 compute-0 nova_compute[186981]: 2025-11-22 10:05:52.059 186985 WARNING nova.compute.manager [req-afaef58b-89af-4f07-9779-23b0f345cca5 req-a7e22f13-ac9a-450c-8d7e-c6103346d599 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Received unexpected event network-vif-plugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 for instance with vm_state active and task_state None.
Nov 22 10:05:53 compute-0 nova_compute[186981]: 2025-11-22 10:05:53.800 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:53 compute-0 ovn_controller[95329]: 2025-11-22T10:05:53Z|00077|binding|INFO|Releasing lport c7e96b5d-1547-4265-a12b-bb708976c4c0 from this chassis (sb_readonly=0)
Nov 22 10:05:53 compute-0 nova_compute[186981]: 2025-11-22 10:05:53.898 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:53 compute-0 NetworkManager[55425]: <info>  [1763805953.8994] manager: (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Nov 22 10:05:53 compute-0 NetworkManager[55425]: <info>  [1763805953.9011] manager: (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 22 10:05:53 compute-0 ovn_controller[95329]: 2025-11-22T10:05:53Z|00078|binding|INFO|Releasing lport c7e96b5d-1547-4265-a12b-bb708976c4c0 from this chassis (sb_readonly=0)
Nov 22 10:05:53 compute-0 nova_compute[186981]: 2025-11-22 10:05:53.938 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:53 compute-0 nova_compute[186981]: 2025-11-22 10:05:53.947 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:54 compute-0 nova_compute[186981]: 2025-11-22 10:05:54.225 186985 DEBUG nova.compute.manager [req-c6064023-20e3-4c7f-bd6f-97973f47996c req-19dbbe68-8039-4443-bdd4-25266dd003c7 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Received event network-changed-b080d9aa-2ce8-4a11-9f13-796159a6e632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:05:54 compute-0 nova_compute[186981]: 2025-11-22 10:05:54.225 186985 DEBUG nova.compute.manager [req-c6064023-20e3-4c7f-bd6f-97973f47996c req-19dbbe68-8039-4443-bdd4-25266dd003c7 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Refreshing instance network info cache due to event network-changed-b080d9aa-2ce8-4a11-9f13-796159a6e632. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:05:54 compute-0 nova_compute[186981]: 2025-11-22 10:05:54.226 186985 DEBUG oslo_concurrency.lockutils [req-c6064023-20e3-4c7f-bd6f-97973f47996c req-19dbbe68-8039-4443-bdd4-25266dd003c7 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:05:54 compute-0 nova_compute[186981]: 2025-11-22 10:05:54.226 186985 DEBUG oslo_concurrency.lockutils [req-c6064023-20e3-4c7f-bd6f-97973f47996c req-19dbbe68-8039-4443-bdd4-25266dd003c7 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:05:54 compute-0 nova_compute[186981]: 2025-11-22 10:05:54.227 186985 DEBUG nova.network.neutron [req-c6064023-20e3-4c7f-bd6f-97973f47996c req-19dbbe68-8039-4443-bdd4-25266dd003c7 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Refreshing network info cache for port b080d9aa-2ce8-4a11-9f13-796159a6e632 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:05:55 compute-0 nova_compute[186981]: 2025-11-22 10:05:55.217 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:55 compute-0 nova_compute[186981]: 2025-11-22 10:05:55.762 186985 DEBUG nova.network.neutron [req-c6064023-20e3-4c7f-bd6f-97973f47996c req-19dbbe68-8039-4443-bdd4-25266dd003c7 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Updated VIF entry in instance network info cache for port b080d9aa-2ce8-4a11-9f13-796159a6e632. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:05:55 compute-0 nova_compute[186981]: 2025-11-22 10:05:55.762 186985 DEBUG nova.network.neutron [req-c6064023-20e3-4c7f-bd6f-97973f47996c req-19dbbe68-8039-4443-bdd4-25266dd003c7 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Updating instance_info_cache with network_info: [{"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:05:55 compute-0 nova_compute[186981]: 2025-11-22 10:05:55.792 186985 DEBUG oslo_concurrency.lockutils [req-c6064023-20e3-4c7f-bd6f-97973f47996c req-19dbbe68-8039-4443-bdd4-25266dd003c7 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:05:58 compute-0 nova_compute[186981]: 2025-11-22 10:05:58.806 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:05:59 compute-0 podman[215342]: 2025-11-22 10:05:59.676231791 +0000 UTC m=+0.127521420 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 10:05:59 compute-0 podman[215343]: 2025-11-22 10:05:59.714962133 +0000 UTC m=+0.164258297 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 10:06:00 compute-0 nova_compute[186981]: 2025-11-22 10:06:00.219 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:02 compute-0 ovn_controller[95329]: 2025-11-22T10:06:02Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d4:3c:fe 10.100.0.8
Nov 22 10:06:02 compute-0 ovn_controller[95329]: 2025-11-22T10:06:02Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d4:3c:fe 10.100.0.8
Nov 22 10:06:03 compute-0 nova_compute[186981]: 2025-11-22 10:06:03.857 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:05 compute-0 nova_compute[186981]: 2025-11-22 10:06:05.221 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:05 compute-0 podman[215402]: 2025-11-22 10:06:05.629842994 +0000 UTC m=+0.076654737 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 22 10:06:05 compute-0 podman[215403]: 2025-11-22 10:06:05.642399325 +0000 UTC m=+0.083479771 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal)
Nov 22 10:06:08 compute-0 nova_compute[186981]: 2025-11-22 10:06:08.859 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:09 compute-0 nova_compute[186981]: 2025-11-22 10:06:09.098 186985 INFO nova.compute.manager [None req-18a41781-0270-45c6-9b75-a0b1533cd919 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Get console output
Nov 22 10:06:09 compute-0 nova_compute[186981]: 2025-11-22 10:06:09.106 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:06:09 compute-0 podman[215442]: 2025-11-22 10:06:09.209177467 +0000 UTC m=+0.075577956 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 10:06:09 compute-0 podman[215443]: 2025-11-22 10:06:09.220722881 +0000 UTC m=+0.082187637 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:06:10 compute-0 nova_compute[186981]: 2025-11-22 10:06:10.224 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:11 compute-0 nova_compute[186981]: 2025-11-22 10:06:11.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:06:11 compute-0 nova_compute[186981]: 2025-11-22 10:06:11.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:06:11 compute-0 nova_compute[186981]: 2025-11-22 10:06:11.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:06:11 compute-0 nova_compute[186981]: 2025-11-22 10:06:11.827 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:06:11 compute-0 nova_compute[186981]: 2025-11-22 10:06:11.828 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquired lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:06:11 compute-0 nova_compute[186981]: 2025-11-22 10:06:11.828 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 10:06:11 compute-0 nova_compute[186981]: 2025-11-22 10:06:11.828 186985 DEBUG nova.objects.instance [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 31123a76-87ae-4a5e-adb5-94bb94b3bc6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:06:12 compute-0 nova_compute[186981]: 2025-11-22 10:06:12.064 186985 DEBUG nova.compute.manager [req-f9a88cba-b44f-4b27-b360-d4bd2419cc8a req-ec241625-4bd2-4401-aa3e-fc2d392687b9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Received event network-changed-b080d9aa-2ce8-4a11-9f13-796159a6e632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:06:12 compute-0 nova_compute[186981]: 2025-11-22 10:06:12.064 186985 DEBUG nova.compute.manager [req-f9a88cba-b44f-4b27-b360-d4bd2419cc8a req-ec241625-4bd2-4401-aa3e-fc2d392687b9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Refreshing instance network info cache due to event network-changed-b080d9aa-2ce8-4a11-9f13-796159a6e632. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:06:12 compute-0 nova_compute[186981]: 2025-11-22 10:06:12.065 186985 DEBUG oslo_concurrency.lockutils [req-f9a88cba-b44f-4b27-b360-d4bd2419cc8a req-ec241625-4bd2-4401-aa3e-fc2d392687b9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:06:13 compute-0 nova_compute[186981]: 2025-11-22 10:06:13.864 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.790 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Updating instance_info_cache with network_info: [{"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.812 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Releasing lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.813 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.813 186985 DEBUG oslo_concurrency.lockutils [req-f9a88cba-b44f-4b27-b360-d4bd2419cc8a req-ec241625-4bd2-4401-aa3e-fc2d392687b9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.813 186985 DEBUG nova.network.neutron [req-f9a88cba-b44f-4b27-b360-d4bd2419cc8a req-ec241625-4bd2-4401-aa3e-fc2d392687b9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Refreshing network info cache for port b080d9aa-2ce8-4a11-9f13-796159a6e632 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.814 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.815 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.815 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.815 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.815 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.837 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.838 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.838 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.838 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.912 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:06:14 compute-0 nova_compute[186981]: 2025-11-22 10:06:14.999 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.000 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.086 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.225 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.305 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.306 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5597MB free_disk=73.4338493347168GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.307 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.307 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.380 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Instance 31123a76-87ae-4a5e-adb5-94bb94b3bc6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.382 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.383 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.433 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.450 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.480 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:06:15 compute-0 nova_compute[186981]: 2025-11-22 10:06:15.480 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:16 compute-0 nova_compute[186981]: 2025-11-22 10:06:16.259 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:06:16 compute-0 nova_compute[186981]: 2025-11-22 10:06:16.260 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:06:16 compute-0 nova_compute[186981]: 2025-11-22 10:06:16.260 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:06:16 compute-0 nova_compute[186981]: 2025-11-22 10:06:16.374 186985 DEBUG nova.network.neutron [req-f9a88cba-b44f-4b27-b360-d4bd2419cc8a req-ec241625-4bd2-4401-aa3e-fc2d392687b9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Updated VIF entry in instance network info cache for port b080d9aa-2ce8-4a11-9f13-796159a6e632. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:06:16 compute-0 nova_compute[186981]: 2025-11-22 10:06:16.374 186985 DEBUG nova.network.neutron [req-f9a88cba-b44f-4b27-b360-d4bd2419cc8a req-ec241625-4bd2-4401-aa3e-fc2d392687b9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Updating instance_info_cache with network_info: [{"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:06:16 compute-0 nova_compute[186981]: 2025-11-22 10:06:16.389 186985 DEBUG oslo_concurrency.lockutils [req-f9a88cba-b44f-4b27-b360-d4bd2419cc8a req-ec241625-4bd2-4401-aa3e-fc2d392687b9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-31123a76-87ae-4a5e-adb5-94bb94b3bc6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.157 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}2750b0ab6595ee67968c760aafe0a36a61c9708a149ee10a8ebbceaef55dd468" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.273 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Sat, 22 Nov 2025 10:06:17 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-511a2a90-b4d7-447b-af2f-5daf519e8f2d x-openstack-request-id: req-511a2a90-b4d7-447b-af2f-5daf519e8f2d _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.273 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "b1ae632e-4cf1-4552-835d-a183c94ebdfc", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/b1ae632e-4cf1-4552-835d-a183c94ebdfc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/b1ae632e-4cf1-4552-835d-a183c94ebdfc"}]}, {"id": "fc9b5f29-3964-4e1f-9071-35ccb94010a9", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/fc9b5f29-3964-4e1f-9071-35ccb94010a9"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/fc9b5f29-3964-4e1f-9071-35ccb94010a9"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.273 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-511a2a90-b4d7-447b-af2f-5daf519e8f2d request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.276 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/b1ae632e-4cf1-4552-835d-a183c94ebdfc -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}2750b0ab6595ee67968c760aafe0a36a61c9708a149ee10a8ebbceaef55dd468" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.347 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Sat, 22 Nov 2025 10:06:17 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-958d433d-12dd-4709-973c-ab2f22acace2 x-openstack-request-id: req-958d433d-12dd-4709-973c-ab2f22acace2 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.347 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "b1ae632e-4cf1-4552-835d-a183c94ebdfc", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/b1ae632e-4cf1-4552-835d-a183c94ebdfc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/b1ae632e-4cf1-4552-835d-a183c94ebdfc"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.348 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/b1ae632e-4cf1-4552-835d-a183c94ebdfc used request id req-958d433d-12dd-4709-973c-ab2f22acace2 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.349 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'name': 'tempest-TestNetworkBasicOps-server-1697286732', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'user_id': 'fd88a700663e44618f0a22f234573806', 'hostId': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.349 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.380 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.write.latency volume: 2689978729 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.381 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1a59425-4e1f-4f29-a61b-3494e03ca186', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2689978729, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-vda', 'timestamp': '2025-11-22T10:06:17.350011', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2d99578-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': 'afde0afaf1d74bad9a9fbd8407f098b46ce17880ea53fd33247f3a588df49a99'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-sda', 'timestamp': '2025-11-22T10:06:17.350011', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2d9b17a-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': 'ddb7725ac2710e185f99d031f460514b4d9f5cdfd79ca250eee77542ee696dc8'}]}, 'timestamp': '2025-11-22 10:06:17.381989', '_unique_id': '6a7e36b996614a67848dcf3a1c7cc029'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.391 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.396 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.401 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 31123a76-87ae-4a5e-adb5-94bb94b3bc6f / tapb080d9aa-2c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.402 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8042d5b4-f1df-41a8-9614-e827268e754c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000004-31123a76-87ae-4a5e-adb5-94bb94b3bc6f-tapb080d9aa-2c', 'timestamp': '2025-11-22T10:06:17.397029', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'tapb080d9aa-2c', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:3c:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb080d9aa-2c'}, 'message_id': 'e2dcd81e-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.16674079, 'message_signature': '1b9eced893fed813308ad460390f7dbaf09f4fdc1e0d60c8aed33dfd9ac9126c'}]}, 'timestamp': '2025-11-22 10:06:17.402745', '_unique_id': '4fa8a52c67e34edf8ebc50deefc7df28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.404 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.405 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.405 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.write.requests volume: 315 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.406 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7b33bca-626a-4e83-87e4-1121eac596b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 315, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-vda', 'timestamp': '2025-11-22T10:06:17.405602', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2dd60f4-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': '4cf8f2f2f34fb84c2786278b67b5683f8d53ab8c27f8dc0832e1b66616b1cc63'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-sda', 'timestamp': '2025-11-22T10:06:17.405602', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2dd78be-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': '24a68ec325030ce6935d8556356b0511507af22d1bcd2a6bac2e7be5a5b5bc90'}]}, 'timestamp': '2025-11-22 10:06:17.406763', '_unique_id': '944216e0ff5f47c99df6fec6b6b45e83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.407 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.409 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.409 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/network.outgoing.bytes volume: 16018 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '814903c4-72e0-490f-bf6f-55ce4f8d0bca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16018, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000004-31123a76-87ae-4a5e-adb5-94bb94b3bc6f-tapb080d9aa-2c', 'timestamp': '2025-11-22T10:06:17.409292', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'tapb080d9aa-2c', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:3c:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb080d9aa-2c'}, 'message_id': 'e2ddf0d2-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.16674079, 'message_signature': 'f4a4fc7e1a173c699f120ad19600844c1f206801c1ed43a10ac9f44336e3bbdc'}]}, 'timestamp': '2025-11-22 10:06:17.409849', '_unique_id': '905a3f41fc7e4defba0f0ba527b9ad08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.410 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.412 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/network.incoming.bytes volume: 19156 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1671940-555e-4850-a181-e1e83700b1f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19156, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000004-31123a76-87ae-4a5e-adb5-94bb94b3bc6f-tapb080d9aa-2c', 'timestamp': '2025-11-22T10:06:17.412340', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'tapb080d9aa-2c', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:3c:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb080d9aa-2c'}, 'message_id': 'e2de69ea-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.16674079, 'message_signature': '248eba1fdc656d4cf700af9596569e4bf0be20304bc8b0a5f87398cd96e391a3'}]}, 'timestamp': '2025-11-22 10:06:17.412912', '_unique_id': '77b9e08469404dcb8b5ca2379c11dcea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.413 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.415 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.439 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/cpu volume: 12110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9361c160-e4c4-48e7-ad99-e1862a14516d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12110000000, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'timestamp': '2025-11-22T10:06:17.415736', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e2e28d40-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.208447584, 'message_signature': 'e9796c651dc36ff3623d159d4c2d4247f228b72d78d06c3b2df0589dc180cb7d'}]}, 'timestamp': '2025-11-22 10:06:17.440164', '_unique_id': '840ec34f06814e3eb3f5dfd6b5558ec7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.441 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.443 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.458 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.459 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ecf9888c-8ffb-46a7-8785-22397c6f4a88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-vda', 'timestamp': '2025-11-22T10:06:17.443753', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2e57802-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.21340547, 'message_signature': '3fbf974629ff855392f04d26ecb68cb7a62e12a0b204f238baf12ea6dbe85ba1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-sda', 'timestamp': '2025-11-22T10:06:17.443753', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2e58e00-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.21340547, 'message_signature': 'bb9ec6fbe4b708f1f0eb51d353652fb9eacb5fbc45644c29b9cb851467c77d1a'}]}, 'timestamp': '2025-11-22 10:06:17.459739', '_unique_id': '5a1a3bb032094262b7af9ede1a549345'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.461 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.462 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.462 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.read.requests volume: 1141 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.463 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '585bd6be-b430-41e4-9192-9831f8a8dc09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1141, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-vda', 'timestamp': '2025-11-22T10:06:17.462897', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2e620ea-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': 'f434f89786f11754b8452b8cfbac7f4686377f55c44c9095e9c45edfb20de2b1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-sda', 'timestamp': '2025-11-22T10:06:17.462897', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2e636a2-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': '47a099520f7c5eaafc18ef141516151716d1e701307eaa4316169ec6f90a1be0'}]}, 'timestamp': '2025-11-22 10:06:17.463994', '_unique_id': '4e2e449e08f940ccbd325c87b9f459c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.465 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.466 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.466 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.466 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1697286732>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1697286732>]
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.467 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.467 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.read.latency volume: 1192072893 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.468 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.read.latency volume: 143480104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '068c441c-70e2-4e5f-b7cb-8a791430f5e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1192072893, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-vda', 'timestamp': '2025-11-22T10:06:17.467833', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2e6df94-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': '570c3ce30183dfd687f89c25ea59fede910d29270e06a36eadfeeb0823b73e86'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 143480104, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-sda', 'timestamp': '2025-11-22T10:06:17.467833', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2e6f4ca-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': '608a2e3d19ed87cf0a10e030f794ab8097d9dd833965619f5d48242c9b0c8007'}]}, 'timestamp': '2025-11-22 10:06:17.468858', '_unique_id': 'b9f74cb4b6e545d888ad9b054c9a8f13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.469 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.471 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.471 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fe8d91c-fc6e-4dc7-8bd0-cc3b6f5ae4f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000004-31123a76-87ae-4a5e-adb5-94bb94b3bc6f-tapb080d9aa-2c', 'timestamp': '2025-11-22T10:06:17.471391', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'tapb080d9aa-2c', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:3c:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb080d9aa-2c'}, 'message_id': 'e2e76cca-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.16674079, 'message_signature': '95e0c8dd7983211207abe5c6cc59d8c08217d5ab79227b7d993b537d0bba6793'}]}, 'timestamp': '2025-11-22 10:06:17.471957', '_unique_id': '9d0515ff5f91483a975b9bdb23552bea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.473 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.474 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.474 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.474 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1697286732>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1697286732>]
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.475 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.475 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c95b285-0c00-47fe-b7f9-e726fd7590da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000004-31123a76-87ae-4a5e-adb5-94bb94b3bc6f-tapb080d9aa-2c', 'timestamp': '2025-11-22T10:06:17.475172', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'tapb080d9aa-2c', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:3c:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb080d9aa-2c'}, 'message_id': 'e2e80112-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.16674079, 'message_signature': 'ac437b9fd69fa61b37d19ebff57dc3407d2aaad20a82266ebd250881055fce76'}]}, 'timestamp': '2025-11-22 10:06:17.475797', '_unique_id': '4c42cb6768064c4cbc312bb1e0785151'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.476 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.478 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.478 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.478 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1697286732>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1697286732>]
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.478 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.479 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/network.incoming.packets volume: 104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47f574e3-cf29-4072-9cb3-bd541a367943', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 104, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000004-31123a76-87ae-4a5e-adb5-94bb94b3bc6f-tapb080d9aa-2c', 'timestamp': '2025-11-22T10:06:17.479131', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'tapb080d9aa-2c', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:3c:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb080d9aa-2c'}, 'message_id': 'e2e89b9a-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.16674079, 'message_signature': '451e36d4da11b02f2ec8f1736f29c2092d2c201dcc9d6b4c57ced53f99b5d313'}]}, 'timestamp': '2025-11-22 10:06:17.479834', '_unique_id': 'b7bb2854578f41948df642ac10575cdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.480 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.482 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.482 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.write.bytes volume: 72962048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.482 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '664ebdef-35ab-437c-b490-d8e5ad9d7792', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72962048, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-vda', 'timestamp': '2025-11-22T10:06:17.482307', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2e918fe-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': 'c37de914c04cd2269bb4862c95161edafd6695ef3b3ff0bbc92da355494b0480'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-sda', 'timestamp': '2025-11-22T10:06:17.482307', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2e92b32-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': 'a649ed6ab4163ca6381b1979612e338ad28fc23d448037e712756bc8796186bc'}]}, 'timestamp': '2025-11-22 10:06:17.483349', '_unique_id': '293b7779ed394173868482a0ee1f2adc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.484 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.485 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2baf168c-6818-4fe9-a915-8267db19108f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000004-31123a76-87ae-4a5e-adb5-94bb94b3bc6f-tapb080d9aa-2c', 'timestamp': '2025-11-22T10:06:17.485931', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'tapb080d9aa-2c', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:3c:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb080d9aa-2c'}, 'message_id': 'e2e9a4fe-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.16674079, 'message_signature': '3086a11eb6a19dd8e663e9a6c65c68a524a7156d24832a3c53c88d9e18e86093'}]}, 'timestamp': '2025-11-22 10:06:17.486607', '_unique_id': '5fb07cbf8d114d94a10b062c12e6032c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.487 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.489 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.489 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8473bf7e-a910-4175-af60-41024f50f04f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000004-31123a76-87ae-4a5e-adb5-94bb94b3bc6f-tapb080d9aa-2c', 'timestamp': '2025-11-22T10:06:17.489775', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'tapb080d9aa-2c', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:3c:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb080d9aa-2c'}, 'message_id': 'e2ea3b8a-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.16674079, 'message_signature': 'd4f8b7535a8813907666eb87321f078d3dee7fb0bb46b79135694886fd01efc3'}]}, 'timestamp': '2025-11-22 10:06:17.490442', '_unique_id': 'fd87fc619ef647f89932ca1c8cd09335'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.491 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.493 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.493 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.493 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cae6ddf4-dd1e-40f6-aecd-478bcf066f2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-vda', 'timestamp': '2025-11-22T10:06:17.493240', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2eabde4-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.21340547, 'message_signature': 'b11953c50b55795b655bd4e5e09576ff30ab40eaa4067240c5546d18d0b28636'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-sda', 'timestamp': '2025-11-22T10:06:17.493240', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2eace42-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.21340547, 'message_signature': '0b38ad420c45eeef112a2c9956da5dd8c4ea5b12648c5a56251f6db53f22e3c4'}]}, 'timestamp': '2025-11-22 10:06:17.494063', '_unique_id': 'f1cef2ecbcfd49498c439db48b3e04d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.494 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.495 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.496 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/memory.usage volume: 42.80078125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6d8bbcb-f808-4b9c-83d9-8bde5832d2ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.80078125, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'timestamp': '2025-11-22T10:06:17.496084', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e2eb2d92-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.208447584, 'message_signature': 'b3c77b0ed0fc6cd971fbbdb9e04bc7e991f046acd34aec593916481524262bd8'}]}, 'timestamp': '2025-11-22 10:06:17.496499', '_unique_id': '2057442ea35a4a93b717309294fdd4ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.497 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.498 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.498 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.498 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbf7cbba-c183-4aa8-afac-8b8799ea2231', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-vda', 'timestamp': '2025-11-22T10:06:17.498379', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2eb8710-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.21340547, 'message_signature': '94533d9eaecb2829cefb784836e4c3afaf22460926cd3c509cb0faa472e9f29a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-sda', 'timestamp': '2025-11-22T10:06:17.498379', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2eb9516-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.21340547, 'message_signature': '4c13118ab8639539886651ce264bd0091976233dfcf8d1588aebc268459622ef'}]}, 'timestamp': '2025-11-22 10:06:17.499138', '_unique_id': '01df5664b912429fa11f799dbab5954a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.499 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.500 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.501 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/network.outgoing.packets volume: 109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cb54515-0e93-4eac-a38c-c130928fae3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 109, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000004-31123a76-87ae-4a5e-adb5-94bb94b3bc6f-tapb080d9aa-2c', 'timestamp': '2025-11-22T10:06:17.501089', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'tapb080d9aa-2c', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:3c:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb080d9aa-2c'}, 'message_id': 'e2ebf010-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.16674079, 'message_signature': '08823cd9fac7f31b50dbd45e91639698f1d02dd6c819f63a8bfa81432b5fdc60'}]}, 'timestamp': '2025-11-22 10:06:17.501492', '_unique_id': '915b080b8e074c6dbc4945c44e040103'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.502 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.503 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.503 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.503 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1697286732>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1697286732>]
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.503 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.503 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.read.bytes volume: 31091200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.504 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38b6c952-d584-4282-b58b-30dea27642b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31091200, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-vda', 'timestamp': '2025-11-22T10:06:17.503895', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2ec5d3e-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': 'f82e21a721301c7c209d3a20ba37c605cc88b13d212fa540bc7f976a66e4c775'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f-sda', 'timestamp': '2025-11-22T10:06:17.503895', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'instance-00000004', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2ec6cc0-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.119712362, 'message_signature': 'd6df7b374a31060c36c8f1a6b3708320b06daad4b6681461a23e68c7f6b01879'}]}, 'timestamp': '2025-11-22 10:06:17.504692', '_unique_id': '6c3ca2a5d7ab4624bb53ffcb2c608128'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.505 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.506 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.506 12 DEBUG ceilometer.compute.pollsters [-] 31123a76-87ae-4a5e-adb5-94bb94b3bc6f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cdcbdc3-9889-4e6d-9694-1445b45617ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000004-31123a76-87ae-4a5e-adb5-94bb94b3bc6f-tapb080d9aa-2c', 'timestamp': '2025-11-22T10:06:17.506801', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1697286732', 'name': 'tapb080d9aa-2c', 'instance_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:3c:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb080d9aa-2c'}, 'message_id': 'e2eccf12-c78a-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3435.16674079, 'message_signature': '9c58b2a8334d9d97717a03e772ba8f40e4d8bb02c8c851ec1ed87a0874330098'}]}, 'timestamp': '2025-11-22 10:06:17.507198', '_unique_id': '87a82a05e5dd4a1e8b2922430e1dd909'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:06:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:06:17.507 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:06:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:17.934 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:17.935 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:17.936 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:18 compute-0 nova_compute[186981]: 2025-11-22 10:06:18.867 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:20 compute-0 nova_compute[186981]: 2025-11-22 10:06:20.226 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:20 compute-0 nova_compute[186981]: 2025-11-22 10:06:20.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:06:20 compute-0 nova_compute[186981]: 2025-11-22 10:06:20.932 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:20 compute-0 nova_compute[186981]: 2025-11-22 10:06:20.932 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:20 compute-0 nova_compute[186981]: 2025-11-22 10:06:20.950 186985 DEBUG nova.compute.manager [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.014 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.015 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.022 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.022 186985 INFO nova.compute.claims [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.149 186985 DEBUG nova.compute.provider_tree [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.164 186985 DEBUG nova.scheduler.client.report [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.190 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.192 186985 DEBUG nova.compute.manager [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.235 186985 DEBUG nova.compute.manager [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.235 186985 DEBUG nova.network.neutron [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.252 186985 INFO nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.276 186985 DEBUG nova.compute.manager [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.380 186985 DEBUG nova.compute.manager [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.382 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.383 186985 INFO nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Creating image(s)
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.384 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.384 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.386 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.410 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.437 186985 DEBUG nova.policy [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.498 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.499 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.500 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.513 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.580 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.581 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:06:21 compute-0 podman[215499]: 2025-11-22 10:06:21.640677991 +0000 UTC m=+0.086693858 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.729 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk 1073741824" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.730 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.731 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.817 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.818 186985 DEBUG nova.virt.disk.api [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.818 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.883 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.884 186985 DEBUG nova.virt.disk.api [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.885 186985 DEBUG nova.objects.instance [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 47ab2247-80c1-4a5a-ac41-b93c94e53ab7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.899 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.899 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Ensure instance console log exists: /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.900 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.900 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:21 compute-0 nova_compute[186981]: 2025-11-22 10:06:21.900 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:22 compute-0 nova_compute[186981]: 2025-11-22 10:06:22.753 186985 DEBUG nova.network.neutron [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Successfully created port: 13437b40-3ed5-4ffe-8109-3d02908c4aa1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:06:23 compute-0 nova_compute[186981]: 2025-11-22 10:06:23.872 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:24 compute-0 nova_compute[186981]: 2025-11-22 10:06:24.573 186985 DEBUG nova.network.neutron [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Successfully updated port: 13437b40-3ed5-4ffe-8109-3d02908c4aa1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:06:24 compute-0 nova_compute[186981]: 2025-11-22 10:06:24.599 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-47ab2247-80c1-4a5a-ac41-b93c94e53ab7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:06:24 compute-0 nova_compute[186981]: 2025-11-22 10:06:24.599 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-47ab2247-80c1-4a5a-ac41-b93c94e53ab7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:06:24 compute-0 nova_compute[186981]: 2025-11-22 10:06:24.600 186985 DEBUG nova.network.neutron [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:06:24 compute-0 nova_compute[186981]: 2025-11-22 10:06:24.679 186985 DEBUG nova.compute.manager [req-aa8aec31-f2c6-4705-931a-700700eb227c req-9d608b06-fad9-49d7-a80b-a305920be504 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Received event network-changed-13437b40-3ed5-4ffe-8109-3d02908c4aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:06:24 compute-0 nova_compute[186981]: 2025-11-22 10:06:24.680 186985 DEBUG nova.compute.manager [req-aa8aec31-f2c6-4705-931a-700700eb227c req-9d608b06-fad9-49d7-a80b-a305920be504 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Refreshing instance network info cache due to event network-changed-13437b40-3ed5-4ffe-8109-3d02908c4aa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:06:24 compute-0 nova_compute[186981]: 2025-11-22 10:06:24.681 186985 DEBUG oslo_concurrency.lockutils [req-aa8aec31-f2c6-4705-931a-700700eb227c req-9d608b06-fad9-49d7-a80b-a305920be504 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-47ab2247-80c1-4a5a-ac41-b93c94e53ab7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:06:24 compute-0 nova_compute[186981]: 2025-11-22 10:06:24.718 186985 DEBUG nova.network.neutron [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.227 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.732 186985 DEBUG nova.network.neutron [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Updating instance_info_cache with network_info: [{"id": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "address": "fa:16:3e:d4:dc:20", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13437b40-3e", "ovs_interfaceid": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.843 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-47ab2247-80c1-4a5a-ac41-b93c94e53ab7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.843 186985 DEBUG nova.compute.manager [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Instance network_info: |[{"id": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "address": "fa:16:3e:d4:dc:20", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13437b40-3e", "ovs_interfaceid": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.844 186985 DEBUG oslo_concurrency.lockutils [req-aa8aec31-f2c6-4705-931a-700700eb227c req-9d608b06-fad9-49d7-a80b-a305920be504 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-47ab2247-80c1-4a5a-ac41-b93c94e53ab7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.844 186985 DEBUG nova.network.neutron [req-aa8aec31-f2c6-4705-931a-700700eb227c req-9d608b06-fad9-49d7-a80b-a305920be504 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Refreshing network info cache for port 13437b40-3ed5-4ffe-8109-3d02908c4aa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.848 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Start _get_guest_xml network_info=[{"id": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "address": "fa:16:3e:d4:dc:20", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13437b40-3e", "ovs_interfaceid": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.853 186985 WARNING nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.858 186985 DEBUG nova.virt.libvirt.host [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.859 186985 DEBUG nova.virt.libvirt.host [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.865 186985 DEBUG nova.virt.libvirt.host [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.865 186985 DEBUG nova.virt.libvirt.host [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.866 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.866 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.867 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.867 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.867 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.868 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.868 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.868 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.868 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.869 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.869 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.869 186985 DEBUG nova.virt.hardware [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.873 186985 DEBUG nova.virt.libvirt.vif [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:06:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1975310769',display_name='tempest-TestNetworkBasicOps-server-1975310769',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1975310769',id=5,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB5XBGAgnB/b5ki+q3pUCkZej9h4dyoq9LnrHwCD9fgnuhg8QvNU/A/aOIyxacI+Hp16JPzSSOVCOrLz+ew4MgvtnZZhpV+1cCPdOIyR3zWJalxYpF7WfRl6IM0GTg+NmA==',key_name='tempest-TestNetworkBasicOps-2071597708',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-2hrszsb0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:06:21Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=47ab2247-80c1-4a5a-ac41-b93c94e53ab7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "address": "fa:16:3e:d4:dc:20", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13437b40-3e", "ovs_interfaceid": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.874 186985 DEBUG nova.network.os_vif_util [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "address": "fa:16:3e:d4:dc:20", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13437b40-3e", "ovs_interfaceid": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.875 186985 DEBUG nova.network.os_vif_util [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:dc:20,bridge_name='br-int',has_traffic_filtering=True,id=13437b40-3ed5-4ffe-8109-3d02908c4aa1,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13437b40-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.876 186985 DEBUG nova.objects.instance [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 47ab2247-80c1-4a5a-ac41-b93c94e53ab7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.891 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:06:25 compute-0 nova_compute[186981]:   <uuid>47ab2247-80c1-4a5a-ac41-b93c94e53ab7</uuid>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   <name>instance-00000005</name>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-1975310769</nova:name>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:06:25</nova:creationTime>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:06:25 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:06:25 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:06:25 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:06:25 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:06:25 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:06:25 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:06:25 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:06:25 compute-0 nova_compute[186981]:         <nova:port uuid="13437b40-3ed5-4ffe-8109-3d02908c4aa1">
Nov 22 10:06:25 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <system>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <entry name="serial">47ab2247-80c1-4a5a-ac41-b93c94e53ab7</entry>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <entry name="uuid">47ab2247-80c1-4a5a-ac41-b93c94e53ab7</entry>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     </system>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   <os>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   </os>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   <features>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   </features>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk.config"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:d4:dc:20"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <target dev="tap13437b40-3e"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/console.log" append="off"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <video>
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     </video>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:06:25 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:06:25 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:06:25 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:06:25 compute-0 nova_compute[186981]: </domain>
Nov 22 10:06:25 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.892 186985 DEBUG nova.compute.manager [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Preparing to wait for external event network-vif-plugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.893 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.893 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.893 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.894 186985 DEBUG nova.virt.libvirt.vif [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:06:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1975310769',display_name='tempest-TestNetworkBasicOps-server-1975310769',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1975310769',id=5,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB5XBGAgnB/b5ki+q3pUCkZej9h4dyoq9LnrHwCD9fgnuhg8QvNU/A/aOIyxacI+Hp16JPzSSOVCOrLz+ew4MgvtnZZhpV+1cCPdOIyR3zWJalxYpF7WfRl6IM0GTg+NmA==',key_name='tempest-TestNetworkBasicOps-2071597708',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-2hrszsb0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:06:21Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=47ab2247-80c1-4a5a-ac41-b93c94e53ab7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "address": "fa:16:3e:d4:dc:20", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13437b40-3e", "ovs_interfaceid": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.894 186985 DEBUG nova.network.os_vif_util [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "address": "fa:16:3e:d4:dc:20", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13437b40-3e", "ovs_interfaceid": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.895 186985 DEBUG nova.network.os_vif_util [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:dc:20,bridge_name='br-int',has_traffic_filtering=True,id=13437b40-3ed5-4ffe-8109-3d02908c4aa1,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13437b40-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.895 186985 DEBUG os_vif [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:dc:20,bridge_name='br-int',has_traffic_filtering=True,id=13437b40-3ed5-4ffe-8109-3d02908c4aa1,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13437b40-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.896 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.896 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.897 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.900 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.900 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13437b40-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.901 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap13437b40-3e, col_values=(('external_ids', {'iface-id': '13437b40-3ed5-4ffe-8109-3d02908c4aa1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:dc:20', 'vm-uuid': '47ab2247-80c1-4a5a-ac41-b93c94e53ab7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.902 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:25 compute-0 NetworkManager[55425]: <info>  [1763805985.9035] manager: (tap13437b40-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.905 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.912 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.914 186985 INFO os_vif [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:dc:20,bridge_name='br-int',has_traffic_filtering=True,id=13437b40-3ed5-4ffe-8109-3d02908c4aa1,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13437b40-3e')
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.962 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.963 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.963 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:d4:dc:20, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:06:25 compute-0 nova_compute[186981]: 2025-11-22 10:06:25.964 186985 INFO nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Using config drive
Nov 22 10:06:26 compute-0 nova_compute[186981]: 2025-11-22 10:06:26.613 186985 INFO nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Creating config drive at /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk.config
Nov 22 10:06:26 compute-0 nova_compute[186981]: 2025-11-22 10:06:26.624 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4mvb0mxz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:06:26 compute-0 nova_compute[186981]: 2025-11-22 10:06:26.766 186985 DEBUG oslo_concurrency.processutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4mvb0mxz" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:06:26 compute-0 kernel: tap13437b40-3e: entered promiscuous mode
Nov 22 10:06:26 compute-0 NetworkManager[55425]: <info>  [1763805986.8563] manager: (tap13437b40-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Nov 22 10:06:26 compute-0 ovn_controller[95329]: 2025-11-22T10:06:26Z|00079|binding|INFO|Claiming lport 13437b40-3ed5-4ffe-8109-3d02908c4aa1 for this chassis.
Nov 22 10:06:26 compute-0 nova_compute[186981]: 2025-11-22 10:06:26.861 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:26 compute-0 ovn_controller[95329]: 2025-11-22T10:06:26Z|00080|binding|INFO|13437b40-3ed5-4ffe-8109-3d02908c4aa1: Claiming fa:16:3e:d4:dc:20 10.100.0.14
Nov 22 10:06:26 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:26.873 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:dc:20 10.100.0.14'], port_security=['fa:16:3e:d4:dc:20 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '47ab2247-80c1-4a5a-ac41-b93c94e53ab7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f275041e-3760-4f44-8319-d9af90fb73f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eaf7449-26c5-4408-9e7a-284a6f6737da, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=13437b40-3ed5-4ffe-8109-3d02908c4aa1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:06:26 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:26.875 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 13437b40-3ed5-4ffe-8109-3d02908c4aa1 in datapath f23eafbf-d3c9-4bb3-9fb9-34bdd735a136 bound to our chassis
Nov 22 10:06:26 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:26.877 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f23eafbf-d3c9-4bb3-9fb9-34bdd735a136
Nov 22 10:06:26 compute-0 ovn_controller[95329]: 2025-11-22T10:06:26Z|00081|binding|INFO|Setting lport 13437b40-3ed5-4ffe-8109-3d02908c4aa1 ovn-installed in OVS
Nov 22 10:06:26 compute-0 ovn_controller[95329]: 2025-11-22T10:06:26Z|00082|binding|INFO|Setting lport 13437b40-3ed5-4ffe-8109-3d02908c4aa1 up in Southbound
Nov 22 10:06:26 compute-0 nova_compute[186981]: 2025-11-22 10:06:26.889 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:26 compute-0 nova_compute[186981]: 2025-11-22 10:06:26.893 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:26 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:26.904 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[b95028bd-fca6-469f-bc35-dfe9091d3018]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:26 compute-0 systemd-udevd[215553]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:06:26 compute-0 systemd-machined[153303]: New machine qemu-5-instance-00000005.
Nov 22 10:06:26 compute-0 NetworkManager[55425]: <info>  [1763805986.9380] device (tap13437b40-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:06:26 compute-0 NetworkManager[55425]: <info>  [1763805986.9392] device (tap13437b40-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:06:26 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Nov 22 10:06:26 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:26.952 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[094747ac-90d8-4a24-840b-d5660f370b9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:26 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:26.958 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff43e13-62b8-48cf-befb-796ad9509633]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:26 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:26.994 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[62eedd08-1e0c-4938-95d5-23f831ee0b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:27 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:27.018 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[d79b6ccd-ec36-4426-a92c-1930014c554c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf23eafbf-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:86:f6:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340756, 'reachable_time': 26942, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215566, 'error': None, 'target': 'ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:27 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:27.036 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[28ba69f0-fc65-4347-8db9-cd553dd9fb5d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf23eafbf-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 340768, 'tstamp': 340768}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215567, 'error': None, 'target': 'ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf23eafbf-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 340772, 'tstamp': 340772}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215567, 'error': None, 'target': 'ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:27 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:27.038 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf23eafbf-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.039 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:27 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:27.041 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf23eafbf-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:27 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:27.041 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:06:27 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:27.042 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf23eafbf-d0, col_values=(('external_ids', {'iface-id': 'c7e96b5d-1547-4265-a12b-bb708976c4c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:27 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:27.042 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.204 186985 DEBUG nova.network.neutron [req-aa8aec31-f2c6-4705-931a-700700eb227c req-9d608b06-fad9-49d7-a80b-a305920be504 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Updated VIF entry in instance network info cache for port 13437b40-3ed5-4ffe-8109-3d02908c4aa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.205 186985 DEBUG nova.network.neutron [req-aa8aec31-f2c6-4705-931a-700700eb227c req-9d608b06-fad9-49d7-a80b-a305920be504 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Updating instance_info_cache with network_info: [{"id": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "address": "fa:16:3e:d4:dc:20", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13437b40-3e", "ovs_interfaceid": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.819 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805987.818748, 47ab2247-80c1-4a5a-ac41-b93c94e53ab7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.820 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] VM Started (Lifecycle Event)
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.920 186985 DEBUG nova.compute.manager [req-df0b2a6a-4e29-4b92-ace6-bb80429832c3 req-899864c8-b32e-4490-84d0-c55fad83581f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Received event network-vif-plugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.921 186985 DEBUG oslo_concurrency.lockutils [req-df0b2a6a-4e29-4b92-ace6-bb80429832c3 req-899864c8-b32e-4490-84d0-c55fad83581f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.921 186985 DEBUG oslo_concurrency.lockutils [req-df0b2a6a-4e29-4b92-ace6-bb80429832c3 req-899864c8-b32e-4490-84d0-c55fad83581f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.922 186985 DEBUG oslo_concurrency.lockutils [req-df0b2a6a-4e29-4b92-ace6-bb80429832c3 req-899864c8-b32e-4490-84d0-c55fad83581f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.922 186985 DEBUG nova.compute.manager [req-df0b2a6a-4e29-4b92-ace6-bb80429832c3 req-899864c8-b32e-4490-84d0-c55fad83581f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Processing event network-vif-plugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.924 186985 DEBUG nova.compute.manager [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.929 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.935 186985 INFO nova.virt.libvirt.driver [-] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Instance spawned successfully.
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.935 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:06:27 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.996 186985 DEBUG oslo_concurrency.lockutils [req-aa8aec31-f2c6-4705-931a-700700eb227c req-9d608b06-fad9-49d7-a80b-a305920be504 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-47ab2247-80c1-4a5a-ac41-b93c94e53ab7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:27.999 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.006 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.077 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.078 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805987.822288, 47ab2247-80c1-4a5a-ac41-b93c94e53ab7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.079 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] VM Paused (Lifecycle Event)
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.088 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.089 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.090 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.090 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.091 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.092 186985 DEBUG nova.virt.libvirt.driver [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.099 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.103 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763805987.9282968, 47ab2247-80c1-4a5a-ac41-b93c94e53ab7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.104 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] VM Resumed (Lifecycle Event)
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.126 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.131 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.155 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.164 186985 INFO nova.compute.manager [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Took 6.78 seconds to spawn the instance on the hypervisor.
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.165 186985 DEBUG nova.compute.manager [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.237 186985 INFO nova.compute.manager [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Took 7.25 seconds to build instance.
Nov 22 10:06:28 compute-0 nova_compute[186981]: 2025-11-22 10:06:28.283 186985 DEBUG oslo_concurrency.lockutils [None req-e0c923f1-49a7-4f65-99d6-0cffc33a1437 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:29 compute-0 nova_compute[186981]: 2025-11-22 10:06:29.994 186985 DEBUG nova.compute.manager [req-acac125d-a4ae-48b6-b003-b2acc1a1e735 req-f7861656-8bd3-4dba-8c5c-84157b2bd11e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Received event network-vif-plugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:06:29 compute-0 nova_compute[186981]: 2025-11-22 10:06:29.994 186985 DEBUG oslo_concurrency.lockutils [req-acac125d-a4ae-48b6-b003-b2acc1a1e735 req-f7861656-8bd3-4dba-8c5c-84157b2bd11e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:29 compute-0 nova_compute[186981]: 2025-11-22 10:06:29.994 186985 DEBUG oslo_concurrency.lockutils [req-acac125d-a4ae-48b6-b003-b2acc1a1e735 req-f7861656-8bd3-4dba-8c5c-84157b2bd11e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:29 compute-0 nova_compute[186981]: 2025-11-22 10:06:29.994 186985 DEBUG oslo_concurrency.lockutils [req-acac125d-a4ae-48b6-b003-b2acc1a1e735 req-f7861656-8bd3-4dba-8c5c-84157b2bd11e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:29 compute-0 nova_compute[186981]: 2025-11-22 10:06:29.995 186985 DEBUG nova.compute.manager [req-acac125d-a4ae-48b6-b003-b2acc1a1e735 req-f7861656-8bd3-4dba-8c5c-84157b2bd11e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] No waiting events found dispatching network-vif-plugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:06:29 compute-0 nova_compute[186981]: 2025-11-22 10:06:29.995 186985 WARNING nova.compute.manager [req-acac125d-a4ae-48b6-b003-b2acc1a1e735 req-f7861656-8bd3-4dba-8c5c-84157b2bd11e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Received unexpected event network-vif-plugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 for instance with vm_state active and task_state None.
Nov 22 10:06:30 compute-0 nova_compute[186981]: 2025-11-22 10:06:30.264 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:30 compute-0 podman[215577]: 2025-11-22 10:06:30.677774622 +0000 UTC m=+0.113603490 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 10:06:30 compute-0 podman[215576]: 2025-11-22 10:06:30.685383249 +0000 UTC m=+0.115924144 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 10:06:30 compute-0 nova_compute[186981]: 2025-11-22 10:06:30.903 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:32 compute-0 nova_compute[186981]: 2025-11-22 10:06:32.830 186985 DEBUG nova.compute.manager [req-94a6dec2-c6f0-46aa-80c6-1c0ac70f2fce req-31e56fd7-8c3f-4509-ad5c-d165b3ab37d3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Received event network-changed-13437b40-3ed5-4ffe-8109-3d02908c4aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:06:32 compute-0 nova_compute[186981]: 2025-11-22 10:06:32.831 186985 DEBUG nova.compute.manager [req-94a6dec2-c6f0-46aa-80c6-1c0ac70f2fce req-31e56fd7-8c3f-4509-ad5c-d165b3ab37d3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Refreshing instance network info cache due to event network-changed-13437b40-3ed5-4ffe-8109-3d02908c4aa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:06:32 compute-0 nova_compute[186981]: 2025-11-22 10:06:32.831 186985 DEBUG oslo_concurrency.lockutils [req-94a6dec2-c6f0-46aa-80c6-1c0ac70f2fce req-31e56fd7-8c3f-4509-ad5c-d165b3ab37d3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-47ab2247-80c1-4a5a-ac41-b93c94e53ab7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:06:32 compute-0 nova_compute[186981]: 2025-11-22 10:06:32.831 186985 DEBUG oslo_concurrency.lockutils [req-94a6dec2-c6f0-46aa-80c6-1c0ac70f2fce req-31e56fd7-8c3f-4509-ad5c-d165b3ab37d3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-47ab2247-80c1-4a5a-ac41-b93c94e53ab7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:06:32 compute-0 nova_compute[186981]: 2025-11-22 10:06:32.832 186985 DEBUG nova.network.neutron [req-94a6dec2-c6f0-46aa-80c6-1c0ac70f2fce req-31e56fd7-8c3f-4509-ad5c-d165b3ab37d3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Refreshing network info cache for port 13437b40-3ed5-4ffe-8109-3d02908c4aa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:06:35 compute-0 nova_compute[186981]: 2025-11-22 10:06:35.071 186985 DEBUG nova.network.neutron [req-94a6dec2-c6f0-46aa-80c6-1c0ac70f2fce req-31e56fd7-8c3f-4509-ad5c-d165b3ab37d3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Updated VIF entry in instance network info cache for port 13437b40-3ed5-4ffe-8109-3d02908c4aa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:06:35 compute-0 nova_compute[186981]: 2025-11-22 10:06:35.072 186985 DEBUG nova.network.neutron [req-94a6dec2-c6f0-46aa-80c6-1c0ac70f2fce req-31e56fd7-8c3f-4509-ad5c-d165b3ab37d3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Updating instance_info_cache with network_info: [{"id": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "address": "fa:16:3e:d4:dc:20", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13437b40-3e", "ovs_interfaceid": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:06:35 compute-0 nova_compute[186981]: 2025-11-22 10:06:35.088 186985 DEBUG oslo_concurrency.lockutils [req-94a6dec2-c6f0-46aa-80c6-1c0ac70f2fce req-31e56fd7-8c3f-4509-ad5c-d165b3ab37d3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-47ab2247-80c1-4a5a-ac41-b93c94e53ab7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:06:35 compute-0 nova_compute[186981]: 2025-11-22 10:06:35.310 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:35 compute-0 nova_compute[186981]: 2025-11-22 10:06:35.905 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:36 compute-0 podman[215620]: 2025-11-22 10:06:36.67644483 +0000 UTC m=+0.110980859 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 10:06:36 compute-0 podman[215621]: 2025-11-22 10:06:36.687338106 +0000 UTC m=+0.117010993 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350)
Nov 22 10:06:39 compute-0 podman[215669]: 2025-11-22 10:06:39.609144947 +0000 UTC m=+0.057431883 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:06:39 compute-0 podman[215670]: 2025-11-22 10:06:39.645741772 +0000 UTC m=+0.078202228 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 22 10:06:40 compute-0 nova_compute[186981]: 2025-11-22 10:06:40.311 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:40 compute-0 nova_compute[186981]: 2025-11-22 10:06:40.908 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:41 compute-0 ovn_controller[95329]: 2025-11-22T10:06:41Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d4:dc:20 10.100.0.14
Nov 22 10:06:41 compute-0 ovn_controller[95329]: 2025-11-22T10:06:41Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d4:dc:20 10.100.0.14
Nov 22 10:06:45 compute-0 nova_compute[186981]: 2025-11-22 10:06:45.310 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:45 compute-0 nova_compute[186981]: 2025-11-22 10:06:45.911 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.314 186985 INFO nova.compute.manager [None req-1a152c11-d067-4fb8-892c-47c41ed4ce73 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Get console output
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.321 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.556 186985 DEBUG oslo_concurrency.lockutils [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.556 186985 DEBUG oslo_concurrency.lockutils [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.557 186985 DEBUG oslo_concurrency.lockutils [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.558 186985 DEBUG oslo_concurrency.lockutils [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.558 186985 DEBUG oslo_concurrency.lockutils [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.560 186985 INFO nova.compute.manager [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Terminating instance
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.562 186985 DEBUG nova.compute.manager [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:06:47 compute-0 kernel: tap13437b40-3e (unregistering): left promiscuous mode
Nov 22 10:06:47 compute-0 NetworkManager[55425]: <info>  [1763806007.5893] device (tap13437b40-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:06:47 compute-0 ovn_controller[95329]: 2025-11-22T10:06:47Z|00083|binding|INFO|Releasing lport 13437b40-3ed5-4ffe-8109-3d02908c4aa1 from this chassis (sb_readonly=0)
Nov 22 10:06:47 compute-0 ovn_controller[95329]: 2025-11-22T10:06:47Z|00084|binding|INFO|Setting lport 13437b40-3ed5-4ffe-8109-3d02908c4aa1 down in Southbound
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.600 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:47 compute-0 ovn_controller[95329]: 2025-11-22T10:06:47Z|00085|binding|INFO|Removing iface tap13437b40-3e ovn-installed in OVS
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.603 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.611 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:dc:20 10.100.0.14'], port_security=['fa:16:3e:d4:dc:20 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '47ab2247-80c1-4a5a-ac41-b93c94e53ab7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f275041e-3760-4f44-8319-d9af90fb73f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eaf7449-26c5-4408-9e7a-284a6f6737da, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=13437b40-3ed5-4ffe-8109-3d02908c4aa1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.613 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 13437b40-3ed5-4ffe-8109-3d02908c4aa1 in datapath f23eafbf-d3c9-4bb3-9fb9-34bdd735a136 unbound from our chassis
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.613 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.615 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f23eafbf-d3c9-4bb3-9fb9-34bdd735a136
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.641 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[49422031-81c6-43ae-922e-aecb3ca50fb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:47 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Nov 22 10:06:47 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 13.518s CPU time.
Nov 22 10:06:47 compute-0 systemd-machined[153303]: Machine qemu-5-instance-00000005 terminated.
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.672 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[73c29f79-b4b9-449f-9a42-738c873b6b60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.675 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c512c0-b95b-47e8-9cbb-d686821c178d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.702 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[33a49b65-9876-40f3-9fe6-f68d2d01872b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.722 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[f436002a-c06d-48b4-a5fd-08f88822facf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf23eafbf-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:86:f6:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340756, 'reachable_time': 26942, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215726, 'error': None, 'target': 'ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.738 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a153dd-8ed2-4ffb-8ebd-26ee1961ee9f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf23eafbf-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 340768, 'tstamp': 340768}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215727, 'error': None, 'target': 'ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf23eafbf-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 340772, 'tstamp': 340772}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215727, 'error': None, 'target': 'ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.739 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf23eafbf-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.740 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.745 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.746 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf23eafbf-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.746 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.746 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf23eafbf-d0, col_values=(('external_ids', {'iface-id': 'c7e96b5d-1547-4265-a12b-bb708976c4c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:47.746 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.785 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.789 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.828 186985 INFO nova.virt.libvirt.driver [-] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Instance destroyed successfully.
Nov 22 10:06:47 compute-0 nova_compute[186981]: 2025-11-22 10:06:47.830 186985 DEBUG nova.objects.instance [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid 47ab2247-80c1-4a5a-ac41-b93c94e53ab7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.092 186985 DEBUG nova.virt.libvirt.vif [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:06:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1975310769',display_name='tempest-TestNetworkBasicOps-server-1975310769',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1975310769',id=5,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB5XBGAgnB/b5ki+q3pUCkZej9h4dyoq9LnrHwCD9fgnuhg8QvNU/A/aOIyxacI+Hp16JPzSSOVCOrLz+ew4MgvtnZZhpV+1cCPdOIyR3zWJalxYpF7WfRl6IM0GTg+NmA==',key_name='tempest-TestNetworkBasicOps-2071597708',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:06:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-2hrszsb0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:06:28Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=47ab2247-80c1-4a5a-ac41-b93c94e53ab7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "address": "fa:16:3e:d4:dc:20", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13437b40-3e", "ovs_interfaceid": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.093 186985 DEBUG nova.network.os_vif_util [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "address": "fa:16:3e:d4:dc:20", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13437b40-3e", "ovs_interfaceid": "13437b40-3ed5-4ffe-8109-3d02908c4aa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.094 186985 DEBUG nova.network.os_vif_util [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:dc:20,bridge_name='br-int',has_traffic_filtering=True,id=13437b40-3ed5-4ffe-8109-3d02908c4aa1,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13437b40-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.094 186985 DEBUG os_vif [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:dc:20,bridge_name='br-int',has_traffic_filtering=True,id=13437b40-3ed5-4ffe-8109-3d02908c4aa1,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13437b40-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.096 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.097 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13437b40-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.100 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.102 186985 INFO os_vif [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:dc:20,bridge_name='br-int',has_traffic_filtering=True,id=13437b40-3ed5-4ffe-8109-3d02908c4aa1,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13437b40-3e')
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.103 186985 INFO nova.virt.libvirt.driver [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Deleting instance files /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7_del
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.103 186985 INFO nova.virt.libvirt.driver [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Deletion of /var/lib/nova/instances/47ab2247-80c1-4a5a-ac41-b93c94e53ab7_del complete
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.460 186985 INFO nova.compute.manager [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Took 0.90 seconds to destroy the instance on the hypervisor.
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.460 186985 DEBUG oslo.service.loopingcall [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.461 186985 DEBUG nova.compute.manager [-] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.461 186985 DEBUG nova.network.neutron [-] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.705 186985 DEBUG nova.compute.manager [req-ed629f16-b3c2-4140-b405-ab3b6feff118 req-b4f78da2-7d15-4dc9-9220-b1aa395f3012 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Received event network-vif-unplugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.705 186985 DEBUG oslo_concurrency.lockutils [req-ed629f16-b3c2-4140-b405-ab3b6feff118 req-b4f78da2-7d15-4dc9-9220-b1aa395f3012 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.706 186985 DEBUG oslo_concurrency.lockutils [req-ed629f16-b3c2-4140-b405-ab3b6feff118 req-b4f78da2-7d15-4dc9-9220-b1aa395f3012 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.706 186985 DEBUG oslo_concurrency.lockutils [req-ed629f16-b3c2-4140-b405-ab3b6feff118 req-b4f78da2-7d15-4dc9-9220-b1aa395f3012 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.706 186985 DEBUG nova.compute.manager [req-ed629f16-b3c2-4140-b405-ab3b6feff118 req-b4f78da2-7d15-4dc9-9220-b1aa395f3012 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] No waiting events found dispatching network-vif-unplugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:06:48 compute-0 nova_compute[186981]: 2025-11-22 10:06:48.707 186985 DEBUG nova.compute.manager [req-ed629f16-b3c2-4140-b405-ab3b6feff118 req-b4f78da2-7d15-4dc9-9220-b1aa395f3012 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Received event network-vif-unplugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 10:06:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:49.761 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:06:49 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:49.762 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:06:49 compute-0 nova_compute[186981]: 2025-11-22 10:06:49.799 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:50 compute-0 nova_compute[186981]: 2025-11-22 10:06:50.311 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:50 compute-0 nova_compute[186981]: 2025-11-22 10:06:50.917 186985 DEBUG nova.compute.manager [req-05a93f1f-1ea8-47e7-8a4e-7240824bc08c req-40b711f0-241f-45f6-a8db-318eb6af1a73 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Received event network-vif-plugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:06:50 compute-0 nova_compute[186981]: 2025-11-22 10:06:50.918 186985 DEBUG oslo_concurrency.lockutils [req-05a93f1f-1ea8-47e7-8a4e-7240824bc08c req-40b711f0-241f-45f6-a8db-318eb6af1a73 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:50 compute-0 nova_compute[186981]: 2025-11-22 10:06:50.918 186985 DEBUG oslo_concurrency.lockutils [req-05a93f1f-1ea8-47e7-8a4e-7240824bc08c req-40b711f0-241f-45f6-a8db-318eb6af1a73 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:50 compute-0 nova_compute[186981]: 2025-11-22 10:06:50.918 186985 DEBUG oslo_concurrency.lockutils [req-05a93f1f-1ea8-47e7-8a4e-7240824bc08c req-40b711f0-241f-45f6-a8db-318eb6af1a73 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:50 compute-0 nova_compute[186981]: 2025-11-22 10:06:50.919 186985 DEBUG nova.compute.manager [req-05a93f1f-1ea8-47e7-8a4e-7240824bc08c req-40b711f0-241f-45f6-a8db-318eb6af1a73 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] No waiting events found dispatching network-vif-plugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:06:50 compute-0 nova_compute[186981]: 2025-11-22 10:06:50.919 186985 WARNING nova.compute.manager [req-05a93f1f-1ea8-47e7-8a4e-7240824bc08c req-40b711f0-241f-45f6-a8db-318eb6af1a73 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Received unexpected event network-vif-plugged-13437b40-3ed5-4ffe-8109-3d02908c4aa1 for instance with vm_state active and task_state deleting.
Nov 22 10:06:51 compute-0 nova_compute[186981]: 2025-11-22 10:06:51.040 186985 DEBUG nova.network.neutron [-] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:06:51 compute-0 nova_compute[186981]: 2025-11-22 10:06:51.066 186985 INFO nova.compute.manager [-] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Took 2.61 seconds to deallocate network for instance.
Nov 22 10:06:51 compute-0 nova_compute[186981]: 2025-11-22 10:06:51.125 186985 DEBUG nova.compute.manager [req-4e092af4-f398-4d36-ae45-31e8b888f619 req-712f17bf-7d1b-441d-8e93-c5d5718f9c5f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Received event network-vif-deleted-13437b40-3ed5-4ffe-8109-3d02908c4aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:06:51 compute-0 nova_compute[186981]: 2025-11-22 10:06:51.127 186985 DEBUG oslo_concurrency.lockutils [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:51 compute-0 nova_compute[186981]: 2025-11-22 10:06:51.127 186985 DEBUG oslo_concurrency.lockutils [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:51 compute-0 nova_compute[186981]: 2025-11-22 10:06:51.210 186985 DEBUG nova.compute.provider_tree [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:06:51 compute-0 nova_compute[186981]: 2025-11-22 10:06:51.249 186985 DEBUG nova.scheduler.client.report [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:06:51 compute-0 nova_compute[186981]: 2025-11-22 10:06:51.297 186985 DEBUG oslo_concurrency.lockutils [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:51 compute-0 nova_compute[186981]: 2025-11-22 10:06:51.329 186985 INFO nova.scheduler.client.report [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance 47ab2247-80c1-4a5a-ac41-b93c94e53ab7
Nov 22 10:06:51 compute-0 nova_compute[186981]: 2025-11-22 10:06:51.413 186985 DEBUG oslo_concurrency.lockutils [None req-eed117eb-1073-43e1-a3a9-fba8524cc597 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "47ab2247-80c1-4a5a-ac41-b93c94e53ab7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:52 compute-0 podman[215745]: 2025-11-22 10:06:52.642485187 +0000 UTC m=+0.091972503 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.099 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.185 186985 DEBUG oslo_concurrency.lockutils [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.186 186985 DEBUG oslo_concurrency.lockutils [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.187 186985 DEBUG oslo_concurrency.lockutils [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.187 186985 DEBUG oslo_concurrency.lockutils [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.188 186985 DEBUG oslo_concurrency.lockutils [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.189 186985 INFO nova.compute.manager [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Terminating instance
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.191 186985 DEBUG nova.compute.manager [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:06:53 compute-0 kernel: tapb080d9aa-2c (unregistering): left promiscuous mode
Nov 22 10:06:53 compute-0 NetworkManager[55425]: <info>  [1763806013.2233] device (tapb080d9aa-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:06:53 compute-0 ovn_controller[95329]: 2025-11-22T10:06:53Z|00086|binding|INFO|Releasing lport b080d9aa-2ce8-4a11-9f13-796159a6e632 from this chassis (sb_readonly=0)
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.275 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:53 compute-0 ovn_controller[95329]: 2025-11-22T10:06:53Z|00087|binding|INFO|Setting lport b080d9aa-2ce8-4a11-9f13-796159a6e632 down in Southbound
Nov 22 10:06:53 compute-0 ovn_controller[95329]: 2025-11-22T10:06:53Z|00088|binding|INFO|Removing iface tapb080d9aa-2c ovn-installed in OVS
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.278 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:53 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:53.284 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:3c:fe 10.100.0.8'], port_security=['fa:16:3e:d4:3c:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '31123a76-87ae-4a5e-adb5-94bb94b3bc6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1f4b7f64-d796-4713-9fe1-bbaf401238e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eaf7449-26c5-4408-9e7a-284a6f6737da, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=b080d9aa-2ce8-4a11-9f13-796159a6e632) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:06:53 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:53.285 104216 INFO neutron.agent.ovn.metadata.agent [-] Port b080d9aa-2ce8-4a11-9f13-796159a6e632 in datapath f23eafbf-d3c9-4bb3-9fb9-34bdd735a136 unbound from our chassis
Nov 22 10:06:53 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:53.286 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f23eafbf-d3c9-4bb3-9fb9-34bdd735a136, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:06:53 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:53.287 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[424b89b6-ae4a-44f6-920c-2fc19df96a6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:53 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:53.288 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136 namespace which is not needed anymore
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.287 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:53 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Nov 22 10:06:53 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 15.978s CPU time.
Nov 22 10:06:53 compute-0 systemd-machined[153303]: Machine qemu-4-instance-00000004 terminated.
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.415 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.421 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.447 186985 INFO nova.virt.libvirt.driver [-] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Instance destroyed successfully.
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.448 186985 DEBUG nova.objects.instance [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid 31123a76-87ae-4a5e-adb5-94bb94b3bc6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.463 186985 DEBUG nova.virt.libvirt.vif [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1697286732',display_name='tempest-TestNetworkBasicOps-server-1697286732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1697286732',id=4,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO/TraUlcHr06zIR1/fy65BadMzxpUUGQrSin66VoKaXPnvQ1h05XrAHMaJIBha2hYo4NDBtQKvRkImpRpFOYS7fh90OolTkra8lDl3ROArQjfbVVcAzy9O1QGUoVCoevQ==',key_name='tempest-TestNetworkBasicOps-1593716210',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:05:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-mynju97c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:05:50Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=31123a76-87ae-4a5e-adb5-94bb94b3bc6f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.463 186985 DEBUG nova.network.os_vif_util [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "address": "fa:16:3e:d4:3c:fe", "network": {"id": "f23eafbf-d3c9-4bb3-9fb9-34bdd735a136", "bridge": "br-int", "label": "tempest-network-smoke--1752418140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb080d9aa-2c", "ovs_interfaceid": "b080d9aa-2ce8-4a11-9f13-796159a6e632", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.464 186985 DEBUG nova.network.os_vif_util [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:3c:fe,bridge_name='br-int',has_traffic_filtering=True,id=b080d9aa-2ce8-4a11-9f13-796159a6e632,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb080d9aa-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.464 186985 DEBUG os_vif [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:3c:fe,bridge_name='br-int',has_traffic_filtering=True,id=b080d9aa-2ce8-4a11-9f13-796159a6e632,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb080d9aa-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.466 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.466 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb080d9aa-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.467 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.468 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.470 186985 INFO os_vif [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:3c:fe,bridge_name='br-int',has_traffic_filtering=True,id=b080d9aa-2ce8-4a11-9f13-796159a6e632,network=Network(f23eafbf-d3c9-4bb3-9fb9-34bdd735a136),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb080d9aa-2c')
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.471 186985 INFO nova.virt.libvirt.driver [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Deleting instance files /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f_del
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.472 186985 INFO nova.virt.libvirt.driver [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Deletion of /var/lib/nova/instances/31123a76-87ae-4a5e-adb5-94bb94b3bc6f_del complete
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.514 186985 INFO nova.compute.manager [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Took 0.32 seconds to destroy the instance on the hypervisor.
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.514 186985 DEBUG oslo.service.loopingcall [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.515 186985 DEBUG nova.compute.manager [-] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.515 186985 DEBUG nova.network.neutron [-] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.868 186985 DEBUG nova.compute.manager [req-c58d2e36-7d0e-456e-95aa-2454b040c85b req-56e75227-ec3b-4b92-828e-54adabb89d68 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Received event network-vif-unplugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.868 186985 DEBUG oslo_concurrency.lockutils [req-c58d2e36-7d0e-456e-95aa-2454b040c85b req-56e75227-ec3b-4b92-828e-54adabb89d68 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.868 186985 DEBUG oslo_concurrency.lockutils [req-c58d2e36-7d0e-456e-95aa-2454b040c85b req-56e75227-ec3b-4b92-828e-54adabb89d68 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.869 186985 DEBUG oslo_concurrency.lockutils [req-c58d2e36-7d0e-456e-95aa-2454b040c85b req-56e75227-ec3b-4b92-828e-54adabb89d68 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.869 186985 DEBUG nova.compute.manager [req-c58d2e36-7d0e-456e-95aa-2454b040c85b req-56e75227-ec3b-4b92-828e-54adabb89d68 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] No waiting events found dispatching network-vif-unplugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:06:53 compute-0 nova_compute[186981]: 2025-11-22 10:06:53.869 186985 DEBUG nova.compute.manager [req-c58d2e36-7d0e-456e-95aa-2454b040c85b req-56e75227-ec3b-4b92-828e-54adabb89d68 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Received event network-vif-unplugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 10:06:54 compute-0 nova_compute[186981]: 2025-11-22 10:06:54.210 186985 DEBUG nova.network.neutron [-] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:06:54 compute-0 nova_compute[186981]: 2025-11-22 10:06:54.223 186985 INFO nova.compute.manager [-] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Took 0.71 seconds to deallocate network for instance.
Nov 22 10:06:54 compute-0 nova_compute[186981]: 2025-11-22 10:06:54.268 186985 DEBUG oslo_concurrency.lockutils [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:54 compute-0 nova_compute[186981]: 2025-11-22 10:06:54.268 186985 DEBUG oslo_concurrency.lockutils [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:54 compute-0 nova_compute[186981]: 2025-11-22 10:06:54.307 186985 DEBUG nova.compute.provider_tree [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:06:54 compute-0 nova_compute[186981]: 2025-11-22 10:06:54.318 186985 DEBUG nova.scheduler.client.report [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:06:54 compute-0 nova_compute[186981]: 2025-11-22 10:06:54.337 186985 DEBUG oslo_concurrency.lockutils [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:54 compute-0 nova_compute[186981]: 2025-11-22 10:06:54.365 186985 INFO nova.scheduler.client.report [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance 31123a76-87ae-4a5e-adb5-94bb94b3bc6f
Nov 22 10:06:54 compute-0 neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136[215326]: [NOTICE]   (215330) : haproxy version is 2.8.14-c23fe91
Nov 22 10:06:54 compute-0 neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136[215326]: [NOTICE]   (215330) : path to executable is /usr/sbin/haproxy
Nov 22 10:06:54 compute-0 neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136[215326]: [WARNING]  (215330) : Exiting Master process...
Nov 22 10:06:54 compute-0 neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136[215326]: [WARNING]  (215330) : Exiting Master process...
Nov 22 10:06:54 compute-0 neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136[215326]: [ALERT]    (215330) : Current worker (215332) exited with code 143 (Terminated)
Nov 22 10:06:54 compute-0 neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136[215326]: [WARNING]  (215330) : All workers exited. Exiting... (0)
Nov 22 10:06:54 compute-0 systemd[1]: libpod-f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850.scope: Deactivated successfully.
Nov 22 10:06:54 compute-0 podman[215794]: 2025-11-22 10:06:54.421723244 +0000 UTC m=+1.046960054 container died f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:06:54 compute-0 nova_compute[186981]: 2025-11-22 10:06:54.426 186985 DEBUG oslo_concurrency.lockutils [None req-b85cf460-bd16-447c-988f-4923ad480d48 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850-userdata-shm.mount: Deactivated successfully.
Nov 22 10:06:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-525336be04cadd96ba03b1cd31428e5dfb8d6be567d17ee9e434c1f741230a4c-merged.mount: Deactivated successfully.
Nov 22 10:06:54 compute-0 podman[215794]: 2025-11-22 10:06:54.452913592 +0000 UTC m=+1.078150392 container cleanup f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 10:06:54 compute-0 systemd[1]: libpod-conmon-f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850.scope: Deactivated successfully.
Nov 22 10:06:54 compute-0 podman[215841]: 2025-11-22 10:06:54.505597305 +0000 UTC m=+0.033974585 container remove f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:06:54 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:54.510 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[bc038a6e-24da-44cb-aa26-a51abe3fe441]: (4, ('Sat Nov 22 10:06:53 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136 (f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850)\nf4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850\nSat Nov 22 10:06:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136 (f4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850)\nf4273f8272c65cf2d5800850a657e07323f91c6d1814b1575f9c4689235dd850\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:54 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:54.511 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[10642522-68b5-4bfc-a3a7-b4421b66b163]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:54 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:54.512 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf23eafbf-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:54 compute-0 kernel: tapf23eafbf-d0: left promiscuous mode
Nov 22 10:06:54 compute-0 nova_compute[186981]: 2025-11-22 10:06:54.515 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:54 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:54.518 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[35d6d76a-3441-4305-bb10-3160af01048e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:54 compute-0 nova_compute[186981]: 2025-11-22 10:06:54.525 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:54 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:54.541 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[f8d10af9-76ee-4e38-ba21-57b531944fac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:54 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:54.542 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[faeb7026-3c05-4a6a-b95f-1846c59e7720]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:54 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:54.554 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[17d45050-a1bd-4c2e-9d99-c0472fbb9969]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340750, 'reachable_time': 41511, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215857, 'error': None, 'target': 'ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:54 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:54.557 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f23eafbf-d3c9-4bb3-9fb9-34bdd735a136 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:06:54 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:54.557 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[94545fab-bf19-454a-9496-47b8be17602e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:06:54 compute-0 systemd[1]: run-netns-ovnmeta\x2df23eafbf\x2dd3c9\x2d4bb3\x2d9fb9\x2d34bdd735a136.mount: Deactivated successfully.
Nov 22 10:06:55 compute-0 nova_compute[186981]: 2025-11-22 10:06:55.313 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:56 compute-0 nova_compute[186981]: 2025-11-22 10:06:56.331 186985 DEBUG nova.compute.manager [req-db0d4561-9ba7-47b6-9bb1-8a708a194c98 req-4730884b-5110-44d0-88b9-61624b8bfd27 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Received event network-vif-plugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:06:56 compute-0 nova_compute[186981]: 2025-11-22 10:06:56.331 186985 DEBUG oslo_concurrency.lockutils [req-db0d4561-9ba7-47b6-9bb1-8a708a194c98 req-4730884b-5110-44d0-88b9-61624b8bfd27 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:06:56 compute-0 nova_compute[186981]: 2025-11-22 10:06:56.332 186985 DEBUG oslo_concurrency.lockutils [req-db0d4561-9ba7-47b6-9bb1-8a708a194c98 req-4730884b-5110-44d0-88b9-61624b8bfd27 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:06:56 compute-0 nova_compute[186981]: 2025-11-22 10:06:56.332 186985 DEBUG oslo_concurrency.lockutils [req-db0d4561-9ba7-47b6-9bb1-8a708a194c98 req-4730884b-5110-44d0-88b9-61624b8bfd27 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "31123a76-87ae-4a5e-adb5-94bb94b3bc6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:06:56 compute-0 nova_compute[186981]: 2025-11-22 10:06:56.333 186985 DEBUG nova.compute.manager [req-db0d4561-9ba7-47b6-9bb1-8a708a194c98 req-4730884b-5110-44d0-88b9-61624b8bfd27 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] No waiting events found dispatching network-vif-plugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:06:56 compute-0 nova_compute[186981]: 2025-11-22 10:06:56.333 186985 WARNING nova.compute.manager [req-db0d4561-9ba7-47b6-9bb1-8a708a194c98 req-4730884b-5110-44d0-88b9-61624b8bfd27 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Received unexpected event network-vif-plugged-b080d9aa-2ce8-4a11-9f13-796159a6e632 for instance with vm_state deleted and task_state None.
Nov 22 10:06:56 compute-0 nova_compute[186981]: 2025-11-22 10:06:56.334 186985 DEBUG nova.compute.manager [req-db0d4561-9ba7-47b6-9bb1-8a708a194c98 req-4730884b-5110-44d0-88b9-61624b8bfd27 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Received event network-vif-deleted-b080d9aa-2ce8-4a11-9f13-796159a6e632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:06:57 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:06:57.764 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:06:58 compute-0 nova_compute[186981]: 2025-11-22 10:06:58.470 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:58 compute-0 nova_compute[186981]: 2025-11-22 10:06:58.479 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:06:58 compute-0 nova_compute[186981]: 2025-11-22 10:06:58.585 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:00 compute-0 nova_compute[186981]: 2025-11-22 10:07:00.315 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:01 compute-0 podman[215859]: 2025-11-22 10:07:01.610360573 +0000 UTC m=+0.064819773 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 22 10:07:01 compute-0 podman[215860]: 2025-11-22 10:07:01.639466174 +0000 UTC m=+0.093285137 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 10:07:02 compute-0 nova_compute[186981]: 2025-11-22 10:07:02.825 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763806007.8245869, 47ab2247-80c1-4a5a-ac41-b93c94e53ab7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:07:02 compute-0 nova_compute[186981]: 2025-11-22 10:07:02.826 186985 INFO nova.compute.manager [-] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] VM Stopped (Lifecycle Event)
Nov 22 10:07:02 compute-0 nova_compute[186981]: 2025-11-22 10:07:02.849 186985 DEBUG nova.compute.manager [None req-af8713d5-1632-43e1-b327-577d664db5f7 - - - - - -] [instance: 47ab2247-80c1-4a5a-ac41-b93c94e53ab7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:07:03 compute-0 nova_compute[186981]: 2025-11-22 10:07:03.472 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:05 compute-0 nova_compute[186981]: 2025-11-22 10:07:05.317 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:07 compute-0 podman[215905]: 2025-11-22 10:07:07.607196261 +0000 UTC m=+0.053312241 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 22 10:07:07 compute-0 podman[215906]: 2025-11-22 10:07:07.639994292 +0000 UTC m=+0.073513791 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter)
Nov 22 10:07:08 compute-0 nova_compute[186981]: 2025-11-22 10:07:08.446 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763806013.445239, 31123a76-87ae-4a5e-adb5-94bb94b3bc6f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:07:08 compute-0 nova_compute[186981]: 2025-11-22 10:07:08.447 186985 INFO nova.compute.manager [-] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] VM Stopped (Lifecycle Event)
Nov 22 10:07:08 compute-0 nova_compute[186981]: 2025-11-22 10:07:08.473 186985 DEBUG nova.compute.manager [None req-a03371a2-e10d-429b-aa21-0444e3e8647b - - - - - -] [instance: 31123a76-87ae-4a5e-adb5-94bb94b3bc6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:07:08 compute-0 nova_compute[186981]: 2025-11-22 10:07:08.476 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:10 compute-0 nova_compute[186981]: 2025-11-22 10:07:10.319 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:10 compute-0 podman[215946]: 2025-11-22 10:07:10.61314059 +0000 UTC m=+0.060972020 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:07:10 compute-0 podman[215945]: 2025-11-22 10:07:10.614169777 +0000 UTC m=+0.061159894 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 10:07:11 compute-0 nova_compute[186981]: 2025-11-22 10:07:11.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:11 compute-0 nova_compute[186981]: 2025-11-22 10:07:11.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:07:11 compute-0 nova_compute[186981]: 2025-11-22 10:07:11.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:07:11 compute-0 nova_compute[186981]: 2025-11-22 10:07:11.608 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:07:12 compute-0 nova_compute[186981]: 2025-11-22 10:07:12.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:12 compute-0 nova_compute[186981]: 2025-11-22 10:07:12.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:12 compute-0 nova_compute[186981]: 2025-11-22 10:07:12.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 10:07:12 compute-0 nova_compute[186981]: 2025-11-22 10:07:12.620 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 10:07:13 compute-0 nova_compute[186981]: 2025-11-22 10:07:13.478 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.377 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.377 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.394 186985 DEBUG nova.compute.manager [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.571 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.572 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.579 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.579 186985 INFO nova.compute.claims [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.620 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.816 186985 DEBUG nova.compute.provider_tree [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.833 186985 DEBUG nova.scheduler.client.report [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.864 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.865 186985 DEBUG nova.compute.manager [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.868 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.868 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.868 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.918 186985 DEBUG nova.compute.manager [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.918 186985 DEBUG nova.network.neutron [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.937 186985 INFO nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:07:14 compute-0 nova_compute[186981]: 2025-11-22 10:07:14.959 186985 DEBUG nova.compute.manager [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.033 186985 DEBUG nova.compute.manager [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.035 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.035 186985 INFO nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Creating image(s)
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.035 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.036 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.036 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.047 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.094 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.095 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5766MB free_disk=73.45891952514648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.095 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.095 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.102 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.103 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.103 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.115 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.169 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.169 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.188 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Instance d5460be9-d4a4-45e1-8bd1-99144801279c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.189 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.189 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.203 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.204 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.204 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.245 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.258 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.258 186985 DEBUG nova.virt.disk.api [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.259 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.277 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.302 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.302 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.303 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.320 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.333 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.333 186985 DEBUG nova.virt.disk.api [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.334 186985 DEBUG nova.objects.instance [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid d5460be9-d4a4-45e1-8bd1-99144801279c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.347 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.348 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Ensure instance console log exists: /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.348 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.349 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.349 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:15 compute-0 nova_compute[186981]: 2025-11-22 10:07:15.681 186985 DEBUG nova.policy [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:07:16 compute-0 nova_compute[186981]: 2025-11-22 10:07:16.312 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:16 compute-0 nova_compute[186981]: 2025-11-22 10:07:16.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:16 compute-0 nova_compute[186981]: 2025-11-22 10:07:16.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:16 compute-0 nova_compute[186981]: 2025-11-22 10:07:16.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:07:17 compute-0 nova_compute[186981]: 2025-11-22 10:07:17.591 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:17 compute-0 nova_compute[186981]: 2025-11-22 10:07:17.591 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:17 compute-0 nova_compute[186981]: 2025-11-22 10:07:17.786 186985 DEBUG nova.network.neutron [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Successfully created port: b4bd60c8-946f-4124-b413-02ee57a5b597 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:07:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:17.935 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:17.935 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:17.936 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:18 compute-0 nova_compute[186981]: 2025-11-22 10:07:18.481 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:18 compute-0 nova_compute[186981]: 2025-11-22 10:07:18.809 186985 DEBUG nova.network.neutron [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Successfully updated port: b4bd60c8-946f-4124-b413-02ee57a5b597 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:07:18 compute-0 nova_compute[186981]: 2025-11-22 10:07:18.832 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:07:18 compute-0 nova_compute[186981]: 2025-11-22 10:07:18.833 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:07:18 compute-0 nova_compute[186981]: 2025-11-22 10:07:18.833 186985 DEBUG nova.network.neutron [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:07:18 compute-0 nova_compute[186981]: 2025-11-22 10:07:18.925 186985 DEBUG nova.compute.manager [req-8730ce20-df35-4b56-812e-1fc7cc3a92f1 req-f04be224-80cc-4a20-ab9e-e2c8514189d5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-changed-b4bd60c8-946f-4124-b413-02ee57a5b597 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:07:18 compute-0 nova_compute[186981]: 2025-11-22 10:07:18.926 186985 DEBUG nova.compute.manager [req-8730ce20-df35-4b56-812e-1fc7cc3a92f1 req-f04be224-80cc-4a20-ab9e-e2c8514189d5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Refreshing instance network info cache due to event network-changed-b4bd60c8-946f-4124-b413-02ee57a5b597. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:07:18 compute-0 nova_compute[186981]: 2025-11-22 10:07:18.926 186985 DEBUG oslo_concurrency.lockutils [req-8730ce20-df35-4b56-812e-1fc7cc3a92f1 req-f04be224-80cc-4a20-ab9e-e2c8514189d5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:07:19 compute-0 nova_compute[186981]: 2025-11-22 10:07:19.298 186985 DEBUG nova.network.neutron [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.323 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.406 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.428 186985 WARNING nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.429 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Triggering sync for uuid d5460be9-d4a4-45e1-8bd1-99144801279c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.430 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.681 186985 DEBUG nova.network.neutron [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updating instance_info_cache with network_info: [{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.702 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.702 186985 DEBUG nova.compute.manager [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Instance network_info: |[{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.703 186985 DEBUG oslo_concurrency.lockutils [req-8730ce20-df35-4b56-812e-1fc7cc3a92f1 req-f04be224-80cc-4a20-ab9e-e2c8514189d5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.703 186985 DEBUG nova.network.neutron [req-8730ce20-df35-4b56-812e-1fc7cc3a92f1 req-f04be224-80cc-4a20-ab9e-e2c8514189d5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Refreshing network info cache for port b4bd60c8-946f-4124-b413-02ee57a5b597 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.709 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Start _get_guest_xml network_info=[{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.716 186985 WARNING nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.722 186985 DEBUG nova.virt.libvirt.host [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.723 186985 DEBUG nova.virt.libvirt.host [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.734 186985 DEBUG nova.virt.libvirt.host [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.735 186985 DEBUG nova.virt.libvirt.host [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.736 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.736 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.738 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.738 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.738 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.739 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.739 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.740 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.740 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.741 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.741 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.741 186985 DEBUG nova.virt.hardware [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.747 186985 DEBUG nova.virt.libvirt.vif [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174788018',display_name='tempest-TestNetworkBasicOps-server-1174788018',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174788018',id=6,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+BWSuiLhQPxAECcK7DaVlWzFtnG0hn0O+hqo9OO4MlApMhNsc33zI/cmxJx6fZIyL5GfThNk2CtY3og8M02CpWqQtXFgtJTqIB8zeQxnYsQ//S5ibUsgIqYg8zuPI+Jg==',key_name='tempest-TestNetworkBasicOps-371597924',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ir8n07cx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:07:14Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=d5460be9-d4a4-45e1-8bd1-99144801279c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.748 186985 DEBUG nova.network.os_vif_util [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.749 186985 DEBUG nova.network.os_vif_util [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:6e:90,bridge_name='br-int',has_traffic_filtering=True,id=b4bd60c8-946f-4124-b413-02ee57a5b597,network=Network(3b46282d-b3ed-40b7-90ce-65aaeac61049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4bd60c8-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.750 186985 DEBUG nova.objects.instance [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid d5460be9-d4a4-45e1-8bd1-99144801279c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.768 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:07:20 compute-0 nova_compute[186981]:   <uuid>d5460be9-d4a4-45e1-8bd1-99144801279c</uuid>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   <name>instance-00000006</name>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-1174788018</nova:name>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:07:20</nova:creationTime>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:07:20 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:07:20 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:07:20 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:07:20 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:07:20 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:07:20 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:07:20 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:07:20 compute-0 nova_compute[186981]:         <nova:port uuid="b4bd60c8-946f-4124-b413-02ee57a5b597">
Nov 22 10:07:20 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <system>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <entry name="serial">d5460be9-d4a4-45e1-8bd1-99144801279c</entry>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <entry name="uuid">d5460be9-d4a4-45e1-8bd1-99144801279c</entry>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     </system>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   <os>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   </os>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   <features>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   </features>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk.config"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:d0:6e:90"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <target dev="tapb4bd60c8-94"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/console.log" append="off"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <video>
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     </video>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:07:20 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:07:20 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:07:20 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:07:20 compute-0 nova_compute[186981]: </domain>
Nov 22 10:07:20 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.770 186985 DEBUG nova.compute.manager [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Preparing to wait for external event network-vif-plugged-b4bd60c8-946f-4124-b413-02ee57a5b597 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.770 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.770 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.770 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.771 186985 DEBUG nova.virt.libvirt.vif [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174788018',display_name='tempest-TestNetworkBasicOps-server-1174788018',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174788018',id=6,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+BWSuiLhQPxAECcK7DaVlWzFtnG0hn0O+hqo9OO4MlApMhNsc33zI/cmxJx6fZIyL5GfThNk2CtY3og8M02CpWqQtXFgtJTqIB8zeQxnYsQ//S5ibUsgIqYg8zuPI+Jg==',key_name='tempest-TestNetworkBasicOps-371597924',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ir8n07cx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:07:14Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=d5460be9-d4a4-45e1-8bd1-99144801279c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.771 186985 DEBUG nova.network.os_vif_util [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.772 186985 DEBUG nova.network.os_vif_util [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:6e:90,bridge_name='br-int',has_traffic_filtering=True,id=b4bd60c8-946f-4124-b413-02ee57a5b597,network=Network(3b46282d-b3ed-40b7-90ce-65aaeac61049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4bd60c8-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.772 186985 DEBUG os_vif [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:6e:90,bridge_name='br-int',has_traffic_filtering=True,id=b4bd60c8-946f-4124-b413-02ee57a5b597,network=Network(3b46282d-b3ed-40b7-90ce-65aaeac61049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4bd60c8-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.773 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.773 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.774 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.777 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.777 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb4bd60c8-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.777 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb4bd60c8-94, col_values=(('external_ids', {'iface-id': 'b4bd60c8-946f-4124-b413-02ee57a5b597', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:6e:90', 'vm-uuid': 'd5460be9-d4a4-45e1-8bd1-99144801279c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.779 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:20 compute-0 NetworkManager[55425]: <info>  [1763806040.7806] manager: (tapb4bd60c8-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.782 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.788 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.790 186985 INFO os_vif [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:6e:90,bridge_name='br-int',has_traffic_filtering=True,id=b4bd60c8-946f-4124-b413-02ee57a5b597,network=Network(3b46282d-b3ed-40b7-90ce-65aaeac61049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4bd60c8-94')
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.849 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.850 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.850 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:d0:6e:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:07:20 compute-0 nova_compute[186981]: 2025-11-22 10:07:20.851 186985 INFO nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Using config drive
Nov 22 10:07:21 compute-0 nova_compute[186981]: 2025-11-22 10:07:21.677 186985 INFO nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Creating config drive at /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk.config
Nov 22 10:07:21 compute-0 nova_compute[186981]: 2025-11-22 10:07:21.686 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe0_9f3xk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:07:21 compute-0 nova_compute[186981]: 2025-11-22 10:07:21.815 186985 DEBUG oslo_concurrency.processutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe0_9f3xk" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:07:21 compute-0 kernel: tapb4bd60c8-94: entered promiscuous mode
Nov 22 10:07:21 compute-0 NetworkManager[55425]: <info>  [1763806041.8816] manager: (tapb4bd60c8-94): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Nov 22 10:07:21 compute-0 ovn_controller[95329]: 2025-11-22T10:07:21Z|00089|binding|INFO|Claiming lport b4bd60c8-946f-4124-b413-02ee57a5b597 for this chassis.
Nov 22 10:07:21 compute-0 ovn_controller[95329]: 2025-11-22T10:07:21Z|00090|binding|INFO|b4bd60c8-946f-4124-b413-02ee57a5b597: Claiming fa:16:3e:d0:6e:90 10.100.0.7
Nov 22 10:07:21 compute-0 nova_compute[186981]: 2025-11-22 10:07:21.880 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:21 compute-0 nova_compute[186981]: 2025-11-22 10:07:21.886 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:21 compute-0 nova_compute[186981]: 2025-11-22 10:07:21.889 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:21 compute-0 nova_compute[186981]: 2025-11-22 10:07:21.893 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:21 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:21.906 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:6e:90 10.100.0.7'], port_security=['fa:16:3e:d0:6e:90 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b46282d-b3ed-40b7-90ce-65aaeac61049', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f5906cda-5c1a-4e21-9e63-b78db27a3837', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6aae3095-cd5a-4c64-be68-4ceb75b321b5, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=b4bd60c8-946f-4124-b413-02ee57a5b597) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:07:21 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:21.907 104216 INFO neutron.agent.ovn.metadata.agent [-] Port b4bd60c8-946f-4124-b413-02ee57a5b597 in datapath 3b46282d-b3ed-40b7-90ce-65aaeac61049 bound to our chassis
Nov 22 10:07:21 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:21.908 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3b46282d-b3ed-40b7-90ce-65aaeac61049
Nov 22 10:07:21 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:21.920 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb81961-df5d-489b-a9b4-f1cff109f4a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:21 compute-0 systemd-udevd[216027]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:07:21 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:21.922 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3b46282d-b1 in ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:07:21 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:21.926 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3b46282d-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:07:21 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:21.926 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[63d1ecbd-dbd4-4fa1-8958-5dde991e7f0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:21 compute-0 systemd-machined[153303]: New machine qemu-6-instance-00000006.
Nov 22 10:07:21 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:21.927 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[27a61b64-f333-4a11-be7a-1411e1974b87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:21 compute-0 NetworkManager[55425]: <info>  [1763806041.9377] device (tapb4bd60c8-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:07:21 compute-0 NetworkManager[55425]: <info>  [1763806041.9384] device (tapb4bd60c8-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:07:21 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:21.945 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[b78b2aae-2993-4d07-93f2-7ca1c6189652]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:21 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Nov 22 10:07:21 compute-0 ovn_controller[95329]: 2025-11-22T10:07:21Z|00091|binding|INFO|Setting lport b4bd60c8-946f-4124-b413-02ee57a5b597 ovn-installed in OVS
Nov 22 10:07:21 compute-0 ovn_controller[95329]: 2025-11-22T10:07:21Z|00092|binding|INFO|Setting lport b4bd60c8-946f-4124-b413-02ee57a5b597 up in Southbound
Nov 22 10:07:21 compute-0 nova_compute[186981]: 2025-11-22 10:07:21.960 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:21 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:21.970 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[384aec56-9782-4893-a8b5-c912b701b9dc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.002 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[828f5a1d-96df-4234-857c-2c17028adc68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 NetworkManager[55425]: <info>  [1763806042.0111] manager: (tap3b46282d-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.010 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[e15bad95-1611-43c9-88fe-c2180125dfe2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.045 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[78e23d3d-f9f2-43f6-8ed5-42a57faaba5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.048 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[b0652dad-b261-414b-8399-2970bf5f3da0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 NetworkManager[55425]: <info>  [1763806042.0826] device (tap3b46282d-b0): carrier: link connected
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.088 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1f274d-955a-4d4c-96a3-5d4937414043]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.112 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[dc87f02b-f28f-4b4e-bea1-cca5a5638568]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3b46282d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:d5:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349978, 'reachable_time': 16696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216059, 'error': None, 'target': 'ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.134 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[727ee926-88a7-4eeb-a974-c71c90cfc453]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:d5e5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349978, 'tstamp': 349978}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216060, 'error': None, 'target': 'ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.152 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7793d0ed-6a64-4fb0-90a9-a880e19ccac4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3b46282d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:d5:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349978, 'reachable_time': 16696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216061, 'error': None, 'target': 'ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.183 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0ceb5d-1bf6-4f13-8435-7ff13ca5f880]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.245 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ee659d-9fb2-473b-a8c0-7e34374143dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.246 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b46282d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.247 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.247 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b46282d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:22 compute-0 kernel: tap3b46282d-b0: entered promiscuous mode
Nov 22 10:07:22 compute-0 NetworkManager[55425]: <info>  [1763806042.2505] manager: (tap3b46282d-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.249 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.252 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3b46282d-b0, col_values=(('external_ids', {'iface-id': '7e6fffde-8524-45a3-90aa-146144523c34'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:22 compute-0 ovn_controller[95329]: 2025-11-22T10:07:22Z|00093|binding|INFO|Releasing lport 7e6fffde-8524-45a3-90aa-146144523c34 from this chassis (sb_readonly=0)
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.255 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3b46282d-b3ed-40b7-90ce-65aaeac61049.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3b46282d-b3ed-40b7-90ce-65aaeac61049.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.256 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[5210f160-dc6b-41c9-804c-6decdef66160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.257 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-3b46282d-b3ed-40b7-90ce-65aaeac61049
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/3b46282d-b3ed-40b7-90ce-65aaeac61049.pid.haproxy
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID 3b46282d-b3ed-40b7-90ce-65aaeac61049
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:07:22 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:22.258 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049', 'env', 'PROCESS_TAG=haproxy-3b46282d-b3ed-40b7-90ce-65aaeac61049', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3b46282d-b3ed-40b7-90ce-65aaeac61049.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.267 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.542 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806042.5419607, d5460be9-d4a4-45e1-8bd1-99144801279c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.543 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] VM Started (Lifecycle Event)
Nov 22 10:07:22 compute-0 podman[216100]: 2025-11-22 10:07:22.617850336 +0000 UTC m=+0.047249597 container create be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.617 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.621 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806042.5422728, d5460be9-d4a4-45e1-8bd1-99144801279c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.622 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] VM Paused (Lifecycle Event)
Nov 22 10:07:22 compute-0 systemd[1]: Started libpod-conmon-be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47.scope.
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.680 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.685 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:07:22 compute-0 podman[216100]: 2025-11-22 10:07:22.591993033 +0000 UTC m=+0.021392274 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:07:22 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:07:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e90a496b9b3e18a8aeb44b5f0de1dddba49e03b6b283abcd947acbfb55608d3b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:07:22 compute-0 podman[216100]: 2025-11-22 10:07:22.721161105 +0000 UTC m=+0.150560406 container init be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:07:22 compute-0 podman[216100]: 2025-11-22 10:07:22.73015519 +0000 UTC m=+0.159554451 container start be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 10:07:22 compute-0 podman[216116]: 2025-11-22 10:07:22.757267097 +0000 UTC m=+0.068881264 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:07:22 compute-0 neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049[216115]: [NOTICE]   (216131) : New worker (216142) forked
Nov 22 10:07:22 compute-0 neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049[216115]: [NOTICE]   (216131) : Loading success.
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.869 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.912 186985 DEBUG nova.compute.manager [req-23a59bf5-e552-4def-945a-63ca804f0ce7 req-ca516014-3c2e-4016-ba5e-9cb9f9d91ba1 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-vif-plugged-b4bd60c8-946f-4124-b413-02ee57a5b597 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.913 186985 DEBUG oslo_concurrency.lockutils [req-23a59bf5-e552-4def-945a-63ca804f0ce7 req-ca516014-3c2e-4016-ba5e-9cb9f9d91ba1 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.913 186985 DEBUG oslo_concurrency.lockutils [req-23a59bf5-e552-4def-945a-63ca804f0ce7 req-ca516014-3c2e-4016-ba5e-9cb9f9d91ba1 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.914 186985 DEBUG oslo_concurrency.lockutils [req-23a59bf5-e552-4def-945a-63ca804f0ce7 req-ca516014-3c2e-4016-ba5e-9cb9f9d91ba1 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.914 186985 DEBUG nova.compute.manager [req-23a59bf5-e552-4def-945a-63ca804f0ce7 req-ca516014-3c2e-4016-ba5e-9cb9f9d91ba1 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Processing event network-vif-plugged-b4bd60c8-946f-4124-b413-02ee57a5b597 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.915 186985 DEBUG nova.compute.manager [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.919 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806042.919415, d5460be9-d4a4-45e1-8bd1-99144801279c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.919 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] VM Resumed (Lifecycle Event)
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.921 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.923 186985 INFO nova.virt.libvirt.driver [-] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Instance spawned successfully.
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.924 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.950 186985 DEBUG nova.network.neutron [req-8730ce20-df35-4b56-812e-1fc7cc3a92f1 req-f04be224-80cc-4a20-ab9e-e2c8514189d5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updated VIF entry in instance network info cache for port b4bd60c8-946f-4124-b413-02ee57a5b597. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:07:22 compute-0 nova_compute[186981]: 2025-11-22 10:07:22.951 186985 DEBUG nova.network.neutron [req-8730ce20-df35-4b56-812e-1fc7cc3a92f1 req-f04be224-80cc-4a20-ab9e-e2c8514189d5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updating instance_info_cache with network_info: [{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.036 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.040 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.040 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.040 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.041 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.041 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.042 186985 DEBUG nova.virt.libvirt.driver [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.046 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.049 186985 DEBUG oslo_concurrency.lockutils [req-8730ce20-df35-4b56-812e-1fc7cc3a92f1 req-f04be224-80cc-4a20-ab9e-e2c8514189d5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.090 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.167 186985 INFO nova.compute.manager [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Took 8.13 seconds to spawn the instance on the hypervisor.
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.168 186985 DEBUG nova.compute.manager [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.317 186985 INFO nova.compute.manager [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Took 8.88 seconds to build instance.
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.411 186985 DEBUG oslo_concurrency.lockutils [None req-b8c56aaa-19f0-4011-8442-bf180c7079d7 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.412 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 2.982s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.413 186985 INFO nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:07:23 compute-0 nova_compute[186981]: 2025-11-22 10:07:23.413 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:24 compute-0 nova_compute[186981]: 2025-11-22 10:07:24.983 186985 DEBUG nova.compute.manager [req-589d65d9-e5c6-4716-8e96-7d2d8cb3f134 req-afb5b136-eff7-4c94-a8d4-058590d6cb91 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-vif-plugged-b4bd60c8-946f-4124-b413-02ee57a5b597 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:07:24 compute-0 nova_compute[186981]: 2025-11-22 10:07:24.984 186985 DEBUG oslo_concurrency.lockutils [req-589d65d9-e5c6-4716-8e96-7d2d8cb3f134 req-afb5b136-eff7-4c94-a8d4-058590d6cb91 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:24 compute-0 nova_compute[186981]: 2025-11-22 10:07:24.985 186985 DEBUG oslo_concurrency.lockutils [req-589d65d9-e5c6-4716-8e96-7d2d8cb3f134 req-afb5b136-eff7-4c94-a8d4-058590d6cb91 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:24 compute-0 nova_compute[186981]: 2025-11-22 10:07:24.985 186985 DEBUG oslo_concurrency.lockutils [req-589d65d9-e5c6-4716-8e96-7d2d8cb3f134 req-afb5b136-eff7-4c94-a8d4-058590d6cb91 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:24 compute-0 nova_compute[186981]: 2025-11-22 10:07:24.986 186985 DEBUG nova.compute.manager [req-589d65d9-e5c6-4716-8e96-7d2d8cb3f134 req-afb5b136-eff7-4c94-a8d4-058590d6cb91 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] No waiting events found dispatching network-vif-plugged-b4bd60c8-946f-4124-b413-02ee57a5b597 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:07:24 compute-0 nova_compute[186981]: 2025-11-22 10:07:24.986 186985 WARNING nova.compute.manager [req-589d65d9-e5c6-4716-8e96-7d2d8cb3f134 req-afb5b136-eff7-4c94-a8d4-058590d6cb91 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received unexpected event network-vif-plugged-b4bd60c8-946f-4124-b413-02ee57a5b597 for instance with vm_state active and task_state None.
Nov 22 10:07:25 compute-0 nova_compute[186981]: 2025-11-22 10:07:25.376 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:25 compute-0 nova_compute[186981]: 2025-11-22 10:07:25.779 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:26 compute-0 ovn_controller[95329]: 2025-11-22T10:07:26Z|00094|binding|INFO|Releasing lport 7e6fffde-8524-45a3-90aa-146144523c34 from this chassis (sb_readonly=0)
Nov 22 10:07:26 compute-0 nova_compute[186981]: 2025-11-22 10:07:26.672 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:26 compute-0 NetworkManager[55425]: <info>  [1763806046.6798] manager: (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Nov 22 10:07:26 compute-0 NetworkManager[55425]: <info>  [1763806046.6828] manager: (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Nov 22 10:07:26 compute-0 ovn_controller[95329]: 2025-11-22T10:07:26Z|00095|binding|INFO|Releasing lport 7e6fffde-8524-45a3-90aa-146144523c34 from this chassis (sb_readonly=0)
Nov 22 10:07:26 compute-0 nova_compute[186981]: 2025-11-22 10:07:26.683 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:27 compute-0 nova_compute[186981]: 2025-11-22 10:07:27.094 186985 DEBUG nova.compute.manager [req-aedeb2de-ed7a-4e0b-8e18-756a85156a73 req-34b4626b-68f1-4b50-81ec-c2c67e0ab61a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-changed-b4bd60c8-946f-4124-b413-02ee57a5b597 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:07:27 compute-0 nova_compute[186981]: 2025-11-22 10:07:27.095 186985 DEBUG nova.compute.manager [req-aedeb2de-ed7a-4e0b-8e18-756a85156a73 req-34b4626b-68f1-4b50-81ec-c2c67e0ab61a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Refreshing instance network info cache due to event network-changed-b4bd60c8-946f-4124-b413-02ee57a5b597. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:07:27 compute-0 nova_compute[186981]: 2025-11-22 10:07:27.095 186985 DEBUG oslo_concurrency.lockutils [req-aedeb2de-ed7a-4e0b-8e18-756a85156a73 req-34b4626b-68f1-4b50-81ec-c2c67e0ab61a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:07:27 compute-0 nova_compute[186981]: 2025-11-22 10:07:27.095 186985 DEBUG oslo_concurrency.lockutils [req-aedeb2de-ed7a-4e0b-8e18-756a85156a73 req-34b4626b-68f1-4b50-81ec-c2c67e0ab61a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:07:27 compute-0 nova_compute[186981]: 2025-11-22 10:07:27.096 186985 DEBUG nova.network.neutron [req-aedeb2de-ed7a-4e0b-8e18-756a85156a73 req-34b4626b-68f1-4b50-81ec-c2c67e0ab61a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Refreshing network info cache for port b4bd60c8-946f-4124-b413-02ee57a5b597 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:07:28 compute-0 nova_compute[186981]: 2025-11-22 10:07:28.569 186985 DEBUG nova.network.neutron [req-aedeb2de-ed7a-4e0b-8e18-756a85156a73 req-34b4626b-68f1-4b50-81ec-c2c67e0ab61a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updated VIF entry in instance network info cache for port b4bd60c8-946f-4124-b413-02ee57a5b597. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:07:28 compute-0 nova_compute[186981]: 2025-11-22 10:07:28.570 186985 DEBUG nova.network.neutron [req-aedeb2de-ed7a-4e0b-8e18-756a85156a73 req-34b4626b-68f1-4b50-81ec-c2c67e0ab61a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updating instance_info_cache with network_info: [{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:07:28 compute-0 nova_compute[186981]: 2025-11-22 10:07:28.586 186985 DEBUG oslo_concurrency.lockutils [req-aedeb2de-ed7a-4e0b-8e18-756a85156a73 req-34b4626b-68f1-4b50-81ec-c2c67e0ab61a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:07:30 compute-0 nova_compute[186981]: 2025-11-22 10:07:30.378 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:30 compute-0 nova_compute[186981]: 2025-11-22 10:07:30.783 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:32 compute-0 podman[216152]: 2025-11-22 10:07:32.600787579 +0000 UTC m=+0.056351325 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 22 10:07:32 compute-0 podman[216153]: 2025-11-22 10:07:32.657268334 +0000 UTC m=+0.111187375 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:07:35 compute-0 nova_compute[186981]: 2025-11-22 10:07:35.380 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:35 compute-0 nova_compute[186981]: 2025-11-22 10:07:35.791 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:36 compute-0 ovn_controller[95329]: 2025-11-22T10:07:36Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d0:6e:90 10.100.0.7
Nov 22 10:07:36 compute-0 ovn_controller[95329]: 2025-11-22T10:07:36Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d0:6e:90 10.100.0.7
Nov 22 10:07:38 compute-0 podman[216210]: 2025-11-22 10:07:38.596990329 +0000 UTC m=+0.053166877 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:07:38 compute-0 podman[216211]: 2025-11-22 10:07:38.621089925 +0000 UTC m=+0.078067294 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public)
Nov 22 10:07:40 compute-0 nova_compute[186981]: 2025-11-22 10:07:40.416 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:40 compute-0 nova_compute[186981]: 2025-11-22 10:07:40.793 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:41 compute-0 nova_compute[186981]: 2025-11-22 10:07:41.614 186985 INFO nova.compute.manager [None req-26ece0b0-f989-4070-82ee-08548ef9d1b2 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Get console output
Nov 22 10:07:41 compute-0 nova_compute[186981]: 2025-11-22 10:07:41.620 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:07:41 compute-0 podman[216249]: 2025-11-22 10:07:41.62470114 +0000 UTC m=+0.072443151 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 10:07:41 compute-0 podman[216248]: 2025-11-22 10:07:41.629457139 +0000 UTC m=+0.087431319 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 10:07:45 compute-0 nova_compute[186981]: 2025-11-22 10:07:45.419 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:45 compute-0 nova_compute[186981]: 2025-11-22 10:07:45.794 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:48 compute-0 nova_compute[186981]: 2025-11-22 10:07:48.454 186985 DEBUG oslo_concurrency.lockutils [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "interface-d5460be9-d4a4-45e1-8bd1-99144801279c-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:48 compute-0 nova_compute[186981]: 2025-11-22 10:07:48.454 186985 DEBUG oslo_concurrency.lockutils [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "interface-d5460be9-d4a4-45e1-8bd1-99144801279c-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:48 compute-0 nova_compute[186981]: 2025-11-22 10:07:48.454 186985 DEBUG nova.objects.instance [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'flavor' on Instance uuid d5460be9-d4a4-45e1-8bd1-99144801279c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:07:48 compute-0 nova_compute[186981]: 2025-11-22 10:07:48.802 186985 DEBUG nova.objects.instance [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_requests' on Instance uuid d5460be9-d4a4-45e1-8bd1-99144801279c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:07:48 compute-0 nova_compute[186981]: 2025-11-22 10:07:48.817 186985 DEBUG nova.network.neutron [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:07:48 compute-0 nova_compute[186981]: 2025-11-22 10:07:48.985 186985 DEBUG nova.policy [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:07:49 compute-0 nova_compute[186981]: 2025-11-22 10:07:49.827 186985 DEBUG nova.network.neutron [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Successfully created port: 47b46c17-414f-45b6-b0f7-72fc46a774d5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:07:50 compute-0 nova_compute[186981]: 2025-11-22 10:07:50.218 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:50 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:50.219 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:07:50 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:50.220 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:07:50 compute-0 nova_compute[186981]: 2025-11-22 10:07:50.420 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:50 compute-0 nova_compute[186981]: 2025-11-22 10:07:50.710 186985 DEBUG nova.network.neutron [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Successfully updated port: 47b46c17-414f-45b6-b0f7-72fc46a774d5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:07:50 compute-0 nova_compute[186981]: 2025-11-22 10:07:50.726 186985 DEBUG oslo_concurrency.lockutils [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:07:50 compute-0 nova_compute[186981]: 2025-11-22 10:07:50.726 186985 DEBUG oslo_concurrency.lockutils [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:07:50 compute-0 nova_compute[186981]: 2025-11-22 10:07:50.726 186985 DEBUG nova.network.neutron [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:07:50 compute-0 nova_compute[186981]: 2025-11-22 10:07:50.796 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:50 compute-0 nova_compute[186981]: 2025-11-22 10:07:50.813 186985 DEBUG nova.compute.manager [req-b4f8dba3-89a5-47f2-a8e5-84f1c04d2574 req-6a1b9e4f-eeea-4822-a9f6-253a8a4f76bf 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-changed-47b46c17-414f-45b6-b0f7-72fc46a774d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:07:50 compute-0 nova_compute[186981]: 2025-11-22 10:07:50.814 186985 DEBUG nova.compute.manager [req-b4f8dba3-89a5-47f2-a8e5-84f1c04d2574 req-6a1b9e4f-eeea-4822-a9f6-253a8a4f76bf 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Refreshing instance network info cache due to event network-changed-47b46c17-414f-45b6-b0f7-72fc46a774d5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:07:50 compute-0 nova_compute[186981]: 2025-11-22 10:07:50.814 186985 DEBUG oslo_concurrency.lockutils [req-b4f8dba3-89a5-47f2-a8e5-84f1c04d2574 req-6a1b9e4f-eeea-4822-a9f6-253a8a4f76bf 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.441 186985 DEBUG nova.network.neutron [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updating instance_info_cache with network_info: [{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.477 186985 DEBUG oslo_concurrency.lockutils [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.478 186985 DEBUG oslo_concurrency.lockutils [req-b4f8dba3-89a5-47f2-a8e5-84f1c04d2574 req-6a1b9e4f-eeea-4822-a9f6-253a8a4f76bf 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.479 186985 DEBUG nova.network.neutron [req-b4f8dba3-89a5-47f2-a8e5-84f1c04d2574 req-6a1b9e4f-eeea-4822-a9f6-253a8a4f76bf 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Refreshing network info cache for port 47b46c17-414f-45b6-b0f7-72fc46a774d5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.483 186985 DEBUG nova.virt.libvirt.vif [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174788018',display_name='tempest-TestNetworkBasicOps-server-1174788018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174788018',id=6,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+BWSuiLhQPxAECcK7DaVlWzFtnG0hn0O+hqo9OO4MlApMhNsc33zI/cmxJx6fZIyL5GfThNk2CtY3og8M02CpWqQtXFgtJTqIB8zeQxnYsQ//S5ibUsgIqYg8zuPI+Jg==',key_name='tempest-TestNetworkBasicOps-371597924',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:07:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ir8n07cx',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:07:23Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=d5460be9-d4a4-45e1-8bd1-99144801279c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.484 186985 DEBUG nova.network.os_vif_util [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.485 186985 DEBUG nova.network.os_vif_util [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.486 186985 DEBUG os_vif [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.487 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.488 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.488 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.491 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.491 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47b46c17-41, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.492 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap47b46c17-41, col_values=(('external_ids', {'iface-id': '47b46c17-414f-45b6-b0f7-72fc46a774d5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:e5:b6', 'vm-uuid': 'd5460be9-d4a4-45e1-8bd1-99144801279c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.495 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 NetworkManager[55425]: <info>  [1763806072.4959] manager: (tap47b46c17-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.498 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.501 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.502 186985 INFO os_vif [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41')
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.504 186985 DEBUG nova.virt.libvirt.vif [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174788018',display_name='tempest-TestNetworkBasicOps-server-1174788018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174788018',id=6,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+BWSuiLhQPxAECcK7DaVlWzFtnG0hn0O+hqo9OO4MlApMhNsc33zI/cmxJx6fZIyL5GfThNk2CtY3og8M02CpWqQtXFgtJTqIB8zeQxnYsQ//S5ibUsgIqYg8zuPI+Jg==',key_name='tempest-TestNetworkBasicOps-371597924',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:07:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ir8n07cx',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:07:23Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=d5460be9-d4a4-45e1-8bd1-99144801279c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.504 186985 DEBUG nova.network.os_vif_util [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.505 186985 DEBUG nova.network.os_vif_util [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.508 186985 DEBUG nova.virt.libvirt.guest [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] attach device xml: <interface type="ethernet">
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <mac address="fa:16:3e:cb:e5:b6"/>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <model type="virtio"/>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <mtu size="1442"/>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <target dev="tap47b46c17-41"/>
Nov 22 10:07:52 compute-0 nova_compute[186981]: </interface>
Nov 22 10:07:52 compute-0 nova_compute[186981]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Nov 22 10:07:52 compute-0 kernel: tap47b46c17-41: entered promiscuous mode
Nov 22 10:07:52 compute-0 NetworkManager[55425]: <info>  [1763806072.5243] manager: (tap47b46c17-41): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.524 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 ovn_controller[95329]: 2025-11-22T10:07:52Z|00096|binding|INFO|Claiming lport 47b46c17-414f-45b6-b0f7-72fc46a774d5 for this chassis.
Nov 22 10:07:52 compute-0 ovn_controller[95329]: 2025-11-22T10:07:52Z|00097|binding|INFO|47b46c17-414f-45b6-b0f7-72fc46a774d5: Claiming fa:16:3e:cb:e5:b6 10.100.0.24
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.527 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.561 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 systemd-udevd[216298]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:07:52 compute-0 ovn_controller[95329]: 2025-11-22T10:07:52Z|00098|binding|INFO|Setting lport 47b46c17-414f-45b6-b0f7-72fc46a774d5 ovn-installed in OVS
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.564 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 ovn_controller[95329]: 2025-11-22T10:07:52Z|00099|binding|INFO|Setting lport 47b46c17-414f-45b6-b0f7-72fc46a774d5 up in Southbound
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.570 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:e5:b6 10.100.0.24'], port_security=['fa:16:3e:cb:e5:b6 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27c5a67c-dc4c-4d67-b4f1-e6a36c0e1eec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=302b0ac8-02d4-44fd-b1fa-ce5720e457ac, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=47b46c17-414f-45b6-b0f7-72fc46a774d5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.572 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 47b46c17-414f-45b6-b0f7-72fc46a774d5 in datapath c442911c-33e7-4086-a8a7-29e86a0c5c15 bound to our chassis
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.573 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c442911c-33e7-4086-a8a7-29e86a0c5c15
Nov 22 10:07:52 compute-0 NetworkManager[55425]: <info>  [1763806072.5773] device (tap47b46c17-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:07:52 compute-0 NetworkManager[55425]: <info>  [1763806072.5783] device (tap47b46c17-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.586 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[a92fd8b5-e77c-4364-9838-bc22fef6444a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.588 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc442911c-31 in ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.590 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc442911c-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.591 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[9bcf6c8d-7258-4f42-87cf-9ff83b66cc9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.592 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[044061df-b93a-4dae-843a-8c5512d3e23b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.606 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e6d890-fa52-482e-8a70-ffa2149cabae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.631 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[07659552-5f21-4098-b973-c25f92845d55]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.662 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[fa41422a-9265-4c3b-86b0-b210d170d6a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.667 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[b02471ae-e661-4303-a91c-2962cb930003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 NetworkManager[55425]: <info>  [1763806072.6680] manager: (tapc442911c-30): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.698 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[f7b605d0-c808-4177-8a71-0374e4b1d2b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.700 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[e80a8c94-3d13-4224-a6ae-04ba27f63cb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 NetworkManager[55425]: <info>  [1763806072.7194] device (tapc442911c-30): carrier: link connected
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.723 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd7f110-68fc-4c3b-9505-70eab0bf19da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.737 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[88e18620-0aa0-4ff8-b3d5-89afe77fa214]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc442911c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:70:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353042, 'reachable_time': 23172, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216325, 'error': None, 'target': 'ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.751 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed6c53a-ecf2-4730-b014-143acb212a86]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:7040'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353042, 'tstamp': 353042}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216326, 'error': None, 'target': 'ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.769 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7d0a0b-0b7a-4813-8267-52a10e835388]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc442911c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:70:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353042, 'reachable_time': 23172, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216327, 'error': None, 'target': 'ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.804 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[c2eecee7-b809-4bf4-86c8-2c7fccef7d6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.849 186985 DEBUG nova.virt.libvirt.driver [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.849 186985 DEBUG nova.virt.libvirt.driver [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.849 186985 DEBUG nova.virt.libvirt.driver [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:d0:6e:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.850 186985 DEBUG nova.virt.libvirt.driver [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:cb:e5:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.867 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[337d8f37-1f58-4bf9-ad41-bb3fe4b3b31a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.869 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc442911c-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.869 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.869 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc442911c-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.871 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 NetworkManager[55425]: <info>  [1763806072.8720] manager: (tapc442911c-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Nov 22 10:07:52 compute-0 kernel: tapc442911c-30: entered promiscuous mode
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.873 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.874 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc442911c-30, col_values=(('external_ids', {'iface-id': '084ee2e6-6c0b-4448-aed8-b5dfcc7529c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.875 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 ovn_controller[95329]: 2025-11-22T10:07:52Z|00100|binding|INFO|Releasing lport 084ee2e6-6c0b-4448-aed8-b5dfcc7529c2 from this chassis (sb_readonly=0)
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.885 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.886 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c442911c-33e7-4086-a8a7-29e86a0c5c15.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c442911c-33e7-4086-a8a7-29e86a0c5c15.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.887 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[329a1ec7-35d8-4e07-b91b-baf9fc0b70d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.888 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-c442911c-33e7-4086-a8a7-29e86a0c5c15
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/c442911c-33e7-4086-a8a7-29e86a0c5c15.pid.haproxy
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID c442911c-33e7-4086-a8a7-29e86a0c5c15
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:07:52 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:52.888 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'env', 'PROCESS_TAG=haproxy-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c442911c-33e7-4086-a8a7-29e86a0c5c15.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:07:52 compute-0 nova_compute[186981]: 2025-11-22 10:07:52.943 186985 DEBUG nova.virt.libvirt.guest [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-1174788018</nova:name>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:07:52</nova:creationTime>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:07:52 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:07:52 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:07:52 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:07:52 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:07:52 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:07:52 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:07:52 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:07:52 compute-0 nova_compute[186981]:     <nova:port uuid="b4bd60c8-946f-4124-b413-02ee57a5b597">
Nov 22 10:07:52 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:07:52 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:07:52 compute-0 nova_compute[186981]:     <nova:port uuid="47b46c17-414f-45b6-b0f7-72fc46a774d5">
Nov 22 10:07:52 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Nov 22 10:07:52 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:07:52 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:07:52 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:07:52 compute-0 nova_compute[186981]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 22 10:07:53 compute-0 nova_compute[186981]: 2025-11-22 10:07:53.005 186985 DEBUG oslo_concurrency.lockutils [None req-1082a2a6-856d-4375-ac22-195959e535b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "interface-d5460be9-d4a4-45e1-8bd1-99144801279c-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 4.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:53 compute-0 podman[216359]: 2025-11-22 10:07:53.276881919 +0000 UTC m=+0.071362132 container create 9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 10:07:53 compute-0 systemd[1]: Started libpod-conmon-9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135.scope.
Nov 22 10:07:53 compute-0 podman[216359]: 2025-11-22 10:07:53.242555435 +0000 UTC m=+0.037035658 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:07:53 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:07:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5708ac4736a24e3c32eac78d634c6f6840f0220d78b4b0e2357b79c87b8e5f4f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:07:53 compute-0 podman[216359]: 2025-11-22 10:07:53.357687766 +0000 UTC m=+0.152167969 container init 9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 10:07:53 compute-0 podman[216359]: 2025-11-22 10:07:53.363989067 +0000 UTC m=+0.158469260 container start 9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:07:53 compute-0 neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15[216375]: [NOTICE]   (216393) : New worker (216402) forked
Nov 22 10:07:53 compute-0 neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15[216375]: [NOTICE]   (216393) : Loading success.
Nov 22 10:07:53 compute-0 podman[216372]: 2025-11-22 10:07:53.401759705 +0000 UTC m=+0.087261184 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 10:07:54 compute-0 nova_compute[186981]: 2025-11-22 10:07:54.812 186985 DEBUG nova.compute.manager [req-feaaace6-f4c9-4a1b-99cd-e7f077efda2a req-8b3598cd-9c81-4126-83ad-fee32a3e40b5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-vif-plugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:07:54 compute-0 nova_compute[186981]: 2025-11-22 10:07:54.812 186985 DEBUG oslo_concurrency.lockutils [req-feaaace6-f4c9-4a1b-99cd-e7f077efda2a req-8b3598cd-9c81-4126-83ad-fee32a3e40b5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:54 compute-0 nova_compute[186981]: 2025-11-22 10:07:54.812 186985 DEBUG oslo_concurrency.lockutils [req-feaaace6-f4c9-4a1b-99cd-e7f077efda2a req-8b3598cd-9c81-4126-83ad-fee32a3e40b5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:54 compute-0 nova_compute[186981]: 2025-11-22 10:07:54.813 186985 DEBUG oslo_concurrency.lockutils [req-feaaace6-f4c9-4a1b-99cd-e7f077efda2a req-8b3598cd-9c81-4126-83ad-fee32a3e40b5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:54 compute-0 nova_compute[186981]: 2025-11-22 10:07:54.813 186985 DEBUG nova.compute.manager [req-feaaace6-f4c9-4a1b-99cd-e7f077efda2a req-8b3598cd-9c81-4126-83ad-fee32a3e40b5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] No waiting events found dispatching network-vif-plugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:07:54 compute-0 nova_compute[186981]: 2025-11-22 10:07:54.813 186985 WARNING nova.compute.manager [req-feaaace6-f4c9-4a1b-99cd-e7f077efda2a req-8b3598cd-9c81-4126-83ad-fee32a3e40b5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received unexpected event network-vif-plugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 for instance with vm_state active and task_state None.
Nov 22 10:07:55 compute-0 nova_compute[186981]: 2025-11-22 10:07:55.464 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:55 compute-0 ovn_controller[95329]: 2025-11-22T10:07:55Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cb:e5:b6 10.100.0.24
Nov 22 10:07:55 compute-0 ovn_controller[95329]: 2025-11-22T10:07:55Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cb:e5:b6 10.100.0.24
Nov 22 10:07:56 compute-0 nova_compute[186981]: 2025-11-22 10:07:56.686 186985 DEBUG nova.network.neutron [req-b4f8dba3-89a5-47f2-a8e5-84f1c04d2574 req-6a1b9e4f-eeea-4822-a9f6-253a8a4f76bf 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updated VIF entry in instance network info cache for port 47b46c17-414f-45b6-b0f7-72fc46a774d5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:07:56 compute-0 nova_compute[186981]: 2025-11-22 10:07:56.687 186985 DEBUG nova.network.neutron [req-b4f8dba3-89a5-47f2-a8e5-84f1c04d2574 req-6a1b9e4f-eeea-4822-a9f6-253a8a4f76bf 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updating instance_info_cache with network_info: [{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:07:56 compute-0 nova_compute[186981]: 2025-11-22 10:07:56.712 186985 DEBUG oslo_concurrency.lockutils [req-b4f8dba3-89a5-47f2-a8e5-84f1c04d2574 req-6a1b9e4f-eeea-4822-a9f6-253a8a4f76bf 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:07:56 compute-0 nova_compute[186981]: 2025-11-22 10:07:56.895 186985 DEBUG nova.compute.manager [req-1acf40ca-7f39-417b-9372-2df5ef6eafd3 req-d70a51b1-6ddb-4ad3-98e9-2dba8e730a07 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-vif-plugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:07:56 compute-0 nova_compute[186981]: 2025-11-22 10:07:56.895 186985 DEBUG oslo_concurrency.lockutils [req-1acf40ca-7f39-417b-9372-2df5ef6eafd3 req-d70a51b1-6ddb-4ad3-98e9-2dba8e730a07 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:07:56 compute-0 nova_compute[186981]: 2025-11-22 10:07:56.896 186985 DEBUG oslo_concurrency.lockutils [req-1acf40ca-7f39-417b-9372-2df5ef6eafd3 req-d70a51b1-6ddb-4ad3-98e9-2dba8e730a07 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:07:56 compute-0 nova_compute[186981]: 2025-11-22 10:07:56.896 186985 DEBUG oslo_concurrency.lockutils [req-1acf40ca-7f39-417b-9372-2df5ef6eafd3 req-d70a51b1-6ddb-4ad3-98e9-2dba8e730a07 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:07:56 compute-0 nova_compute[186981]: 2025-11-22 10:07:56.896 186985 DEBUG nova.compute.manager [req-1acf40ca-7f39-417b-9372-2df5ef6eafd3 req-d70a51b1-6ddb-4ad3-98e9-2dba8e730a07 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] No waiting events found dispatching network-vif-plugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:07:56 compute-0 nova_compute[186981]: 2025-11-22 10:07:56.897 186985 WARNING nova.compute.manager [req-1acf40ca-7f39-417b-9372-2df5ef6eafd3 req-d70a51b1-6ddb-4ad3-98e9-2dba8e730a07 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received unexpected event network-vif-plugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 for instance with vm_state active and task_state None.
Nov 22 10:07:57 compute-0 nova_compute[186981]: 2025-11-22 10:07:57.495 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:07:58 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:07:58.227 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:08:00 compute-0 nova_compute[186981]: 2025-11-22 10:08:00.467 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.253 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "b0411876-4519-4bcb-a325-000d02d8b59d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.254 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.269 186985 DEBUG nova.compute.manager [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.364 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.365 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.374 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.374 186985 INFO nova.compute.claims [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.492 186985 DEBUG nova.compute.provider_tree [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.497 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.506 186985 DEBUG nova.scheduler.client.report [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.530 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.530 186985 DEBUG nova.compute.manager [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.583 186985 DEBUG nova.compute.manager [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.583 186985 DEBUG nova.network.neutron [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.603 186985 INFO nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.618 186985 DEBUG nova.compute.manager [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.709 186985 DEBUG nova.compute.manager [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.711 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.711 186985 INFO nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Creating image(s)
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.712 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.713 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.715 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.738 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.831 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.833 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.834 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.860 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.930 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:08:02 compute-0 nova_compute[186981]: 2025-11-22 10:08:02.932 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.175 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk 1073741824" returned: 0 in 0.242s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.176 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.177 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.243 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.244 186985 DEBUG nova.virt.disk.api [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.245 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.297 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.298 186985 DEBUG nova.virt.disk.api [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.298 186985 DEBUG nova.objects.instance [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid b0411876-4519-4bcb-a325-000d02d8b59d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.337 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.337 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Ensure instance console log exists: /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.338 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.338 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.339 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:08:03 compute-0 podman[216426]: 2025-11-22 10:08:03.61643996 +0000 UTC m=+0.069445590 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:08:03 compute-0 podman[216427]: 2025-11-22 10:08:03.629811184 +0000 UTC m=+0.082487835 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 10:08:03 compute-0 nova_compute[186981]: 2025-11-22 10:08:03.726 186985 DEBUG nova.policy [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:08:05 compute-0 nova_compute[186981]: 2025-11-22 10:08:05.470 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:06 compute-0 nova_compute[186981]: 2025-11-22 10:08:06.771 186985 DEBUG nova.network.neutron [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Successfully created port: 41558171-90e6-4dc7-9cc3-1edd109bc81a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:08:07 compute-0 nova_compute[186981]: 2025-11-22 10:08:07.502 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:07 compute-0 nova_compute[186981]: 2025-11-22 10:08:07.859 186985 DEBUG nova.network.neutron [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Successfully updated port: 41558171-90e6-4dc7-9cc3-1edd109bc81a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:08:07 compute-0 nova_compute[186981]: 2025-11-22 10:08:07.875 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-b0411876-4519-4bcb-a325-000d02d8b59d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:08:07 compute-0 nova_compute[186981]: 2025-11-22 10:08:07.875 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-b0411876-4519-4bcb-a325-000d02d8b59d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:08:07 compute-0 nova_compute[186981]: 2025-11-22 10:08:07.876 186985 DEBUG nova.network.neutron [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:08:07 compute-0 nova_compute[186981]: 2025-11-22 10:08:07.948 186985 DEBUG nova.compute.manager [req-b1c15da8-197c-4dc8-93cd-9c7e03a58cd2 req-b300f247-24cd-4659-ad2c-50429d28e134 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Received event network-changed-41558171-90e6-4dc7-9cc3-1edd109bc81a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:08:07 compute-0 nova_compute[186981]: 2025-11-22 10:08:07.949 186985 DEBUG nova.compute.manager [req-b1c15da8-197c-4dc8-93cd-9c7e03a58cd2 req-b300f247-24cd-4659-ad2c-50429d28e134 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Refreshing instance network info cache due to event network-changed-41558171-90e6-4dc7-9cc3-1edd109bc81a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:08:07 compute-0 nova_compute[186981]: 2025-11-22 10:08:07.949 186985 DEBUG oslo_concurrency.lockutils [req-b1c15da8-197c-4dc8-93cd-9c7e03a58cd2 req-b300f247-24cd-4659-ad2c-50429d28e134 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-b0411876-4519-4bcb-a325-000d02d8b59d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:08:08 compute-0 nova_compute[186981]: 2025-11-22 10:08:08.026 186985 DEBUG nova.network.neutron [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:08:08 compute-0 nova_compute[186981]: 2025-11-22 10:08:08.959 186985 DEBUG nova.network.neutron [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Updating instance_info_cache with network_info: [{"id": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "address": "fa:16:3e:8b:32:cc", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41558171-90", "ovs_interfaceid": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:08:08 compute-0 nova_compute[186981]: 2025-11-22 10:08:08.980 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-b0411876-4519-4bcb-a325-000d02d8b59d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:08:08 compute-0 nova_compute[186981]: 2025-11-22 10:08:08.981 186985 DEBUG nova.compute.manager [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Instance network_info: |[{"id": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "address": "fa:16:3e:8b:32:cc", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41558171-90", "ovs_interfaceid": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:08:08 compute-0 nova_compute[186981]: 2025-11-22 10:08:08.982 186985 DEBUG oslo_concurrency.lockutils [req-b1c15da8-197c-4dc8-93cd-9c7e03a58cd2 req-b300f247-24cd-4659-ad2c-50429d28e134 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-b0411876-4519-4bcb-a325-000d02d8b59d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:08:08 compute-0 nova_compute[186981]: 2025-11-22 10:08:08.983 186985 DEBUG nova.network.neutron [req-b1c15da8-197c-4dc8-93cd-9c7e03a58cd2 req-b300f247-24cd-4659-ad2c-50429d28e134 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Refreshing network info cache for port 41558171-90e6-4dc7-9cc3-1edd109bc81a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:08:08 compute-0 nova_compute[186981]: 2025-11-22 10:08:08.988 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Start _get_guest_xml network_info=[{"id": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "address": "fa:16:3e:8b:32:cc", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41558171-90", "ovs_interfaceid": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:08:08 compute-0 nova_compute[186981]: 2025-11-22 10:08:08.994 186985 WARNING nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:08.999 186985 DEBUG nova.virt.libvirt.host [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.000 186985 DEBUG nova.virt.libvirt.host [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.003 186985 DEBUG nova.virt.libvirt.host [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.004 186985 DEBUG nova.virt.libvirt.host [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.004 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.004 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.005 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.005 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.006 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.006 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.006 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.006 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.007 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.007 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.007 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.008 186985 DEBUG nova.virt.hardware [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.011 186985 DEBUG nova.virt.libvirt.vif [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:08:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1810802209',display_name='tempest-TestNetworkBasicOps-server-1810802209',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1810802209',id=7,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAbbXXsIxD9wxsfv66fFL+lSheLjpBiFb5vj3NTwETNWQ8ZD8FwpAhGjO41WyXXNuX9cxL/oHwQYwznwPPVtjIv+ZMh0inUp0Q00EufkgXKzcCZ/Bjs7PX3eoTREcBk1jw==',key_name='tempest-TestNetworkBasicOps-1595292033',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-pokmlw76',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:08:02Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=b0411876-4519-4bcb-a325-000d02d8b59d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "address": "fa:16:3e:8b:32:cc", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41558171-90", "ovs_interfaceid": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.012 186985 DEBUG nova.network.os_vif_util [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "address": "fa:16:3e:8b:32:cc", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41558171-90", "ovs_interfaceid": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.013 186985 DEBUG nova.network.os_vif_util [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:32:cc,bridge_name='br-int',has_traffic_filtering=True,id=41558171-90e6-4dc7-9cc3-1edd109bc81a,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41558171-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.014 186985 DEBUG nova.objects.instance [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid b0411876-4519-4bcb-a325-000d02d8b59d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.030 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:08:09 compute-0 nova_compute[186981]:   <uuid>b0411876-4519-4bcb-a325-000d02d8b59d</uuid>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   <name>instance-00000007</name>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-1810802209</nova:name>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:08:08</nova:creationTime>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:08:09 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:08:09 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:08:09 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:08:09 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:08:09 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:08:09 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:08:09 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:08:09 compute-0 nova_compute[186981]:         <nova:port uuid="41558171-90e6-4dc7-9cc3-1edd109bc81a">
Nov 22 10:08:09 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.30" ipVersion="4"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <system>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <entry name="serial">b0411876-4519-4bcb-a325-000d02d8b59d</entry>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <entry name="uuid">b0411876-4519-4bcb-a325-000d02d8b59d</entry>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     </system>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   <os>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   </os>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   <features>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   </features>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk.config"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:8b:32:cc"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <target dev="tap41558171-90"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/console.log" append="off"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <video>
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     </video>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:08:09 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:08:09 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:08:09 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:08:09 compute-0 nova_compute[186981]: </domain>
Nov 22 10:08:09 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.031 186985 DEBUG nova.compute.manager [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Preparing to wait for external event network-vif-plugged-41558171-90e6-4dc7-9cc3-1edd109bc81a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.032 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.033 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.033 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.034 186985 DEBUG nova.virt.libvirt.vif [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:08:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1810802209',display_name='tempest-TestNetworkBasicOps-server-1810802209',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1810802209',id=7,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAbbXXsIxD9wxsfv66fFL+lSheLjpBiFb5vj3NTwETNWQ8ZD8FwpAhGjO41WyXXNuX9cxL/oHwQYwznwPPVtjIv+ZMh0inUp0Q00EufkgXKzcCZ/Bjs7PX3eoTREcBk1jw==',key_name='tempest-TestNetworkBasicOps-1595292033',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-pokmlw76',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:08:02Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=b0411876-4519-4bcb-a325-000d02d8b59d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "address": "fa:16:3e:8b:32:cc", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41558171-90", "ovs_interfaceid": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.035 186985 DEBUG nova.network.os_vif_util [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "address": "fa:16:3e:8b:32:cc", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41558171-90", "ovs_interfaceid": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.036 186985 DEBUG nova.network.os_vif_util [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:32:cc,bridge_name='br-int',has_traffic_filtering=True,id=41558171-90e6-4dc7-9cc3-1edd109bc81a,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41558171-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.037 186985 DEBUG os_vif [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:32:cc,bridge_name='br-int',has_traffic_filtering=True,id=41558171-90e6-4dc7-9cc3-1edd109bc81a,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41558171-90') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.039 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.039 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.040 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.044 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.045 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41558171-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.046 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap41558171-90, col_values=(('external_ids', {'iface-id': '41558171-90e6-4dc7-9cc3-1edd109bc81a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:32:cc', 'vm-uuid': 'b0411876-4519-4bcb-a325-000d02d8b59d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.081 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:09 compute-0 NetworkManager[55425]: <info>  [1763806089.0836] manager: (tap41558171-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.086 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.089 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.091 186985 INFO os_vif [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:32:cc,bridge_name='br-int',has_traffic_filtering=True,id=41558171-90e6-4dc7-9cc3-1edd109bc81a,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41558171-90')
Nov 22 10:08:09 compute-0 podman[216474]: 2025-11-22 10:08:09.201299114 +0000 UTC m=+0.066320623 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, architecture=x86_64, distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 10:08:09 compute-0 podman[216473]: 2025-11-22 10:08:09.204647965 +0000 UTC m=+0.068005168 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.205 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.206 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.206 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:8b:32:cc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.207 186985 INFO nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Using config drive
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.837 186985 INFO nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Creating config drive at /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk.config
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.846 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpln9ch0qm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:08:09 compute-0 nova_compute[186981]: 2025-11-22 10:08:09.986 186985 DEBUG oslo_concurrency.processutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpln9ch0qm" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:08:10 compute-0 NetworkManager[55425]: <info>  [1763806090.0734] manager: (tap41558171-90): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Nov 22 10:08:10 compute-0 kernel: tap41558171-90: entered promiscuous mode
Nov 22 10:08:10 compute-0 ovn_controller[95329]: 2025-11-22T10:08:10Z|00101|binding|INFO|Claiming lport 41558171-90e6-4dc7-9cc3-1edd109bc81a for this chassis.
Nov 22 10:08:10 compute-0 ovn_controller[95329]: 2025-11-22T10:08:10Z|00102|binding|INFO|41558171-90e6-4dc7-9cc3-1edd109bc81a: Claiming fa:16:3e:8b:32:cc 10.100.0.30
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.077 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.085 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:32:cc 10.100.0.30'], port_security=['fa:16:3e:8b:32:cc 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '399685b7-1818-4f78-a311-c5ac964c48b4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=302b0ac8-02d4-44fd-b1fa-ce5720e457ac, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=41558171-90e6-4dc7-9cc3-1edd109bc81a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.085 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 41558171-90e6-4dc7-9cc3-1edd109bc81a in datapath c442911c-33e7-4086-a8a7-29e86a0c5c15 bound to our chassis
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.086 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c442911c-33e7-4086-a8a7-29e86a0c5c15
Nov 22 10:08:10 compute-0 ovn_controller[95329]: 2025-11-22T10:08:10Z|00103|binding|INFO|Setting lport 41558171-90e6-4dc7-9cc3-1edd109bc81a ovn-installed in OVS
Nov 22 10:08:10 compute-0 ovn_controller[95329]: 2025-11-22T10:08:10Z|00104|binding|INFO|Setting lport 41558171-90e6-4dc7-9cc3-1edd109bc81a up in Southbound
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.104 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.107 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[6c49dde1-88c6-4da5-8dce-990f9459873b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.109 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:10 compute-0 systemd-udevd[216527]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:08:10 compute-0 systemd-machined[153303]: New machine qemu-7-instance-00000007.
Nov 22 10:08:10 compute-0 NetworkManager[55425]: <info>  [1763806090.1368] device (tap41558171-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:08:10 compute-0 NetworkManager[55425]: <info>  [1763806090.1379] device (tap41558171-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.144 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[8da5ce97-45d1-47ba-81e0-702a818eb92a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.146 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[d21121f2-d85b-45de-829d-4e4fd409e43a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:08:10 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.167 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[fe0547ba-926a-4530-ad16-17bfa7533761]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.182 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca7c3ea-3a10-4612-8573-90a17ad73940]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc442911c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:70:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353042, 'reachable_time': 23172, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216536, 'error': None, 'target': 'ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.198 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[a373b3c9-bd2b-41a2-9a96-5ede8b082d73]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc442911c-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353054, 'tstamp': 353054}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216537, 'error': None, 'target': 'ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapc442911c-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353056, 'tstamp': 353056}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216537, 'error': None, 'target': 'ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.200 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc442911c-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.234 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.235 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.235 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc442911c-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.235 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.236 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc442911c-30, col_values=(('external_ids', {'iface-id': '084ee2e6-6c0b-4448-aed8-b5dfcc7529c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:08:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:10.236 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.342 186985 DEBUG nova.compute.manager [req-567b22bd-6564-4b91-9b46-6d2fae190724 req-650b7c39-252f-4516-9640-19bcedd3cba9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Received event network-vif-plugged-41558171-90e6-4dc7-9cc3-1edd109bc81a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.343 186985 DEBUG oslo_concurrency.lockutils [req-567b22bd-6564-4b91-9b46-6d2fae190724 req-650b7c39-252f-4516-9640-19bcedd3cba9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.343 186985 DEBUG oslo_concurrency.lockutils [req-567b22bd-6564-4b91-9b46-6d2fae190724 req-650b7c39-252f-4516-9640-19bcedd3cba9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.343 186985 DEBUG oslo_concurrency.lockutils [req-567b22bd-6564-4b91-9b46-6d2fae190724 req-650b7c39-252f-4516-9640-19bcedd3cba9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.344 186985 DEBUG nova.compute.manager [req-567b22bd-6564-4b91-9b46-6d2fae190724 req-650b7c39-252f-4516-9640-19bcedd3cba9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Processing event network-vif-plugged-41558171-90e6-4dc7-9cc3-1edd109bc81a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.472 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.560 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806090.5595155, b0411876-4519-4bcb-a325-000d02d8b59d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.560 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] VM Started (Lifecycle Event)
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.562 186985 DEBUG nova.compute.manager [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.566 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.569 186985 INFO nova.virt.libvirt.driver [-] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Instance spawned successfully.
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.569 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.577 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.579 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.587 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.588 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.588 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.588 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.589 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.589 186985 DEBUG nova.virt.libvirt.driver [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.595 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.595 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806090.561965, b0411876-4519-4bcb-a325-000d02d8b59d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.596 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] VM Paused (Lifecycle Event)
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.626 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.630 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806090.5652683, b0411876-4519-4bcb-a325-000d02d8b59d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.630 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] VM Resumed (Lifecycle Event)
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.657 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.660 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.674 186985 INFO nova.compute.manager [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Took 7.96 seconds to spawn the instance on the hypervisor.
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.675 186985 DEBUG nova.compute.manager [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.687 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.690 186985 DEBUG nova.network.neutron [req-b1c15da8-197c-4dc8-93cd-9c7e03a58cd2 req-b300f247-24cd-4659-ad2c-50429d28e134 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Updated VIF entry in instance network info cache for port 41558171-90e6-4dc7-9cc3-1edd109bc81a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.690 186985 DEBUG nova.network.neutron [req-b1c15da8-197c-4dc8-93cd-9c7e03a58cd2 req-b300f247-24cd-4659-ad2c-50429d28e134 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Updating instance_info_cache with network_info: [{"id": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "address": "fa:16:3e:8b:32:cc", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41558171-90", "ovs_interfaceid": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.722 186985 DEBUG oslo_concurrency.lockutils [req-b1c15da8-197c-4dc8-93cd-9c7e03a58cd2 req-b300f247-24cd-4659-ad2c-50429d28e134 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-b0411876-4519-4bcb-a325-000d02d8b59d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.748 186985 INFO nova.compute.manager [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Took 8.42 seconds to build instance.
Nov 22 10:08:10 compute-0 nova_compute[186981]: 2025-11-22 10:08:10.763 186985 DEBUG oslo_concurrency.lockutils [None req-4421475f-b1d8-466b-8847-4c0681bd1646 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.440 186985 DEBUG nova.compute.manager [req-33bf9d80-f29b-4141-b634-5937aafb72d8 req-76a29141-88f5-4fb3-9818-6a3a8e897529 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Received event network-vif-plugged-41558171-90e6-4dc7-9cc3-1edd109bc81a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.441 186985 DEBUG oslo_concurrency.lockutils [req-33bf9d80-f29b-4141-b634-5937aafb72d8 req-76a29141-88f5-4fb3-9818-6a3a8e897529 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.441 186985 DEBUG oslo_concurrency.lockutils [req-33bf9d80-f29b-4141-b634-5937aafb72d8 req-76a29141-88f5-4fb3-9818-6a3a8e897529 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.441 186985 DEBUG oslo_concurrency.lockutils [req-33bf9d80-f29b-4141-b634-5937aafb72d8 req-76a29141-88f5-4fb3-9818-6a3a8e897529 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.441 186985 DEBUG nova.compute.manager [req-33bf9d80-f29b-4141-b634-5937aafb72d8 req-76a29141-88f5-4fb3-9818-6a3a8e897529 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] No waiting events found dispatching network-vif-plugged-41558171-90e6-4dc7-9cc3-1edd109bc81a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.442 186985 WARNING nova.compute.manager [req-33bf9d80-f29b-4141-b634-5937aafb72d8 req-76a29141-88f5-4fb3-9818-6a3a8e897529 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Received unexpected event network-vif-plugged-41558171-90e6-4dc7-9cc3-1edd109bc81a for instance with vm_state active and task_state None.
Nov 22 10:08:12 compute-0 podman[216553]: 2025-11-22 10:08:12.607470933 +0000 UTC m=+0.061016888 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:08:12 compute-0 podman[216552]: 2025-11-22 10:08:12.607513674 +0000 UTC m=+0.061047238 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.612 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.613 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.613 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.805 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.805 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquired lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.806 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 10:08:12 compute-0 nova_compute[186981]: 2025-11-22 10:08:12.806 186985 DEBUG nova.objects.instance [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d5460be9-d4a4-45e1-8bd1-99144801279c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:08:14 compute-0 nova_compute[186981]: 2025-11-22 10:08:14.084 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.474 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.931 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updating instance_info_cache with network_info: [{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.966 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Releasing lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.966 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.967 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.967 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.967 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.968 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.996 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.996 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.996 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:08:15 compute-0 nova_compute[186981]: 2025-11-22 10:08:15.996 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.098 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.166 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.167 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.219 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.224 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.278 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.279 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.329 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.482 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.483 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5433MB free_disk=73.42858123779297GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.484 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.484 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.593 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Instance d5460be9-d4a4-45e1-8bd1-99144801279c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.594 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Instance b0411876-4519-4bcb-a325-000d02d8b59d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.594 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.594 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.622 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing inventories for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.652 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating ProviderTree inventory for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.652 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating inventory in ProviderTree for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.670 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing aggregate associations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.700 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing trait associations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703, traits: COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE2,HW_CPU_X86_F16C,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE4A,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.760 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.775 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.798 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:08:16 compute-0 nova_compute[186981]: 2025-11-22 10:08:16.799 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.840 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'name': 'tempest-TestNetworkBasicOps-server-1810802209', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'user_id': 'fd88a700663e44618f0a22f234573806', 'hostId': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.843 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'name': 'tempest-TestNetworkBasicOps-server-1174788018', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'user_id': 'fd88a700663e44618f0a22f234573806', 'hostId': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.843 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.859 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.860 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.872 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.873 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e4eb7b5-ac98-42ab-89f5-af0099244120', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-vda', 'timestamp': '2025-11-22T10:08:16.843781', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a10a3e6-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.613267973, 'message_signature': '45c37dbbebe3367eea0fb1951320c476535c95f4b1e49e50fca874828849a289'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-sda', 'timestamp': '2025-11-22T10:08:16.843781', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a10b0c0-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.613267973, 'message_signature': 'd6c046b72bb1c95e9e51ac92bd63eabffff3f238d0703a8cdb99a8291bb5111c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-vda', 'timestamp': '2025-11-22T10:08:16.843781', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a12a1d2-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.62997666, 'message_signature': 'ac6809547757ee790fa15f8d2f0f7251eb490085d738860f6de270c642eaf483'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-sda', 'timestamp': '2025-11-22T10:08:16.843781', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a12af06-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.62997666, 'message_signature': '82c8b92cc8550d02aa4094669c901372932be7ac95938d64f260d612b4d24d08'}]}, 'timestamp': '2025-11-22 10:08:16.873625', '_unique_id': '1229fb9e05274955a9b28a5f4e94ee71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.874 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.875 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.875 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.876 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.876 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.876 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e21271b2-510e-4812-ae21-cd1f5e40558c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-vda', 'timestamp': '2025-11-22T10:08:16.875943', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a13168a-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.613267973, 'message_signature': '82d8cef7405bcc246165ff1a8c0f3e7e33b26a01fb6d85ba0ee3313c2f67f3b4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 
'b0411876-4519-4bcb-a325-000d02d8b59d-sda', 'timestamp': '2025-11-22T10:08:16.875943', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a132170-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.613267973, 'message_signature': '0f746f726ba18966aaa72b730764ea136a7a11ae5aabdb89a207113aee57195f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-vda', 'timestamp': '2025-11-22T10:08:16.875943', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a132c9c-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.62997666, 'message_signature': '91b6730b35eca98df0dabdd271804db32917097f7f320a148e5a42dc9e92ee3a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-sda', 'timestamp': '2025-11-22T10:08:16.875943', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a1336ce-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.62997666, 'message_signature': '89d617ddbb79d81ee4804b9a487361396a5f2ac4ecf4559304cd779de08f46e1'}]}, 'timestamp': '2025-11-22 10:08:16.877046', '_unique_id': 'f45dc01a14f142ecbf2e9d7f67f16d4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.877 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.878 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.880 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b0411876-4519-4bcb-a325-000d02d8b59d / tap41558171-90 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.881 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.883 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d5460be9-d4a4-45e1-8bd1-99144801279c / tapb4bd60c8-94 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.884 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d5460be9-d4a4-45e1-8bd1-99144801279c / tap47b46c17-41 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.884 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.outgoing.bytes volume: 28514 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.884 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.outgoing.bytes volume: 1858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0be04850-db89-4036-acf6-4744356f0a45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000007-b0411876-4519-4bcb-a325-000d02d8b59d-tap41558171-90', 'timestamp': '2025-11-22T10:08:16.878477', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'tap41558171-90', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:32:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap41558171-90'}, 'message_id': '2a13e6a0-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.647978671, 'message_signature': '0e1006628abe3e85fe775280b94dac6a792700d001271200d585071d4c5434f1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28514, 'user_id': 
'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tapb4bd60c8-94', 'timestamp': '2025-11-22T10:08:16.878477', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tapb4bd60c8-94', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:6e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4bd60c8-94'}, 'message_id': '2a1461e8-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': '08baa80fa178dbb3617d509ae804f06814f8d4b91167d856b17cb752c2005c22'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1858, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tap47b46c17-41', 'timestamp': '2025-11-22T10:08:16.878477', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tap47b46c17-41', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': 
'11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cb:e5:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap47b46c17-41'}, 'message_id': '2a146b02-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': '2eb060a6f45200b7ad788b526377ed27f8bc82207dbbfa4f6d6c017b363c9724'}]}, 'timestamp': '2025-11-22 10:08:16.884945', '_unique_id': 'd5d179dfd0564695818a6bd23f274135'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.885 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.886 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.911 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.911 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.936 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.write.requests volume: 319 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.936 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7982e720-deca-4289-909f-8347565117fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-vda', 'timestamp': '2025-11-22T10:08:16.886648', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a188868-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': '309fe846a183d3a727a7f9cf46eb6d9395bca63afd27d32b04a243dba12acdc7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 
'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-sda', 'timestamp': '2025-11-22T10:08:16.886648', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a189330-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': 'fe40d0e6bf77edac90897a32da33c03ae29f414f6d6c9fb46f60fde4b946538b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 319, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-vda', 'timestamp': '2025-11-22T10:08:16.886648', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a1c51fa-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': '4bd9e1fa08e268b2f52661a910dffbe5aaf30e893991eb64762bdfd46736f2bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-sda', 'timestamp': '2025-11-22T10:08:16.886648', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a1c5e70-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': '33a113b92a3493e77daf7e944b464b3f73646b771bef3497638f91bb09d4b0b4'}]}, 'timestamp': '2025-11-22 10:08:16.937038', '_unique_id': '2c6daa9971b9459cb4ded2a5457cbcb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.938 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.939 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.incoming.bytes volume: 34027 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.939 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.incoming.bytes volume: 1366 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40b3656f-72a2-4ab5-b413-a4857f44e359', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000007-b0411876-4519-4bcb-a325-000d02d8b59d-tap41558171-90', 'timestamp': '2025-11-22T10:08:16.938823', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'tap41558171-90', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:32:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap41558171-90'}, 'message_id': '2a1caf6a-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.647978671, 'message_signature': 'b2446a357559c7d65af80ef840e088d077a34522ec5b053730dcea149683e796'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 34027, 'user_id': 
'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tapb4bd60c8-94', 'timestamp': '2025-11-22T10:08:16.938823', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tapb4bd60c8-94', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:6e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4bd60c8-94'}, 'message_id': '2a1cb82a-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'd0dcef379a93096828513907a06081754630633dc4d3bd780292d20c0fb12013'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1366, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tap47b46c17-41', 'timestamp': '2025-11-22T10:08:16.938823', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tap47b46c17-41', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': 
'11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cb:e5:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap47b46c17-41'}, 'message_id': '2a1cc19e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'b8f934500902ae6bbd01db8bb73713566541c0e1036628452388f818d4a28878'}]}, 'timestamp': '2025-11-22 10:08:16.939587', '_unique_id': '67469d4557b14125b4ad26d10a2e9c60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.940 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.941 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.941 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.write.bytes volume: 73080832 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.941 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1719b611-648e-4794-90ee-02f9bf841146', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-vda', 'timestamp': '2025-11-22T10:08:16.940818', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a1cfc2c-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': '372d2c58e92e5212c282c08884bc1221f295cbe65174d25e0bab21648d2794d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 
'b0411876-4519-4bcb-a325-000d02d8b59d-sda', 'timestamp': '2025-11-22T10:08:16.940818', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a1d042e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': '04fa11964d20fb82fb3b548bb2dcb5df2933ef1b31fbd73c7c7bd444f1f17c56'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73080832, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-vda', 'timestamp': '2025-11-22T10:08:16.940818', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a1d0b7c-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': 'c65938238de0d829b6970e388a63b3c4a87229a6061bd810a61941e70d721059'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-sda', 'timestamp': '2025-11-22T10:08:16.940818', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a1d1310-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': '7d1d3de0ffa3d208c31a65401c7371bfa45ff1c3ccf95cd8f198537954580b50'}]}, 'timestamp': '2025-11-22 10:08:16.941654', '_unique_id': '02c9678a3ad649b38599adbf04c77f35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.942 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06ee7001-87c9-46c7-9e7f-b677b36179d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000007-b0411876-4519-4bcb-a325-000d02d8b59d-tap41558171-90', 'timestamp': '2025-11-22T10:08:16.942854', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'tap41558171-90', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:32:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap41558171-90'}, 'message_id': '2a1d4b00-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.647978671, 'message_signature': '9380d7757c5165053b9573295127d57de39c09e5f46726034b8215bad8c377bd'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tapb4bd60c8-94', 'timestamp': '2025-11-22T10:08:16.942854', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tapb4bd60c8-94', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:6e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4bd60c8-94'}, 'message_id': '2a1d5424-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'eca2f67d6eb2e4f903291407ee35b4106594eb9e0864b744865bbcb1a8898eec'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tap47b46c17-41', 'timestamp': '2025-11-22T10:08:16.942854', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tap47b46c17-41', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': 
'11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cb:e5:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap47b46c17-41'}, 'message_id': '2a1d5c1c-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': '1b0655d94d736222be4d749018ccfa399af3f27617d8c59813a3b0de3d3e946e'}]}, 'timestamp': '2025-11-22 10:08:16.943513', '_unique_id': '17f0a668ec5244c3b97d1d6fa75d3237'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.944 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.967 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.967 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance b0411876-4519-4bcb-a325-000d02d8b59d: ceilometer.compute.pollsters.NoVolumeException
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.983 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/memory.usage volume: 43.23828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c324c5e4-7a81-4b78-b8db-df75c39fa327', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.23828125, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'timestamp': '2025-11-22T10:08:16.944649', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '2a237f16-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.752740632, 'message_signature': '1283a58ed94f0007f44e59a7e312110663fd8789dac0f06e0a8a1fbad46b3e0a'}]}, 'timestamp': '2025-11-22 10:08:16.983865', '_unique_id': 'f1eed5473dbb41f0bf9ae4fd30f77645'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.984 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.985 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.986 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/cpu volume: 6150000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.986 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/cpu volume: 11350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2766d01e-d669-4e31-b627-6942bfa97d71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6150000000, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'timestamp': '2025-11-22T10:08:16.986034', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2a23e3c0-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.736662853, 'message_signature': 'f2fbcf7d1bf2d4407a90aff4d1eea03246cd4f5001d176a9ced1e59620d896e8'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11350000000, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 
'd5460be9-d4a4-45e1-8bd1-99144801279c', 'timestamp': '2025-11-22T10:08:16.986034', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2a23f28e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.752740632, 'message_signature': '0150338aa2a64fbb86a76c7bdef7cb52bea887f42260df94c66fa16a062fa9d4'}]}, 'timestamp': '2025-11-22 10:08:16.986740', '_unique_id': '64fdc1577a994ca0bda93e18fc3e666f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.987 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.988 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.988 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1810802209>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1174788018>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1810802209>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1174788018>]
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.989 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.989 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.990 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6241e87b-eafa-4748-ab29-3e6380ca9d23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000007-b0411876-4519-4bcb-a325-000d02d8b59d-tap41558171-90', 'timestamp': '2025-11-22T10:08:16.989306', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'tap41558171-90', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:32:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap41558171-90'}, 'message_id': '2a2463f4-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.647978671, 'message_signature': '390eead15474a11395a2836e7859628586ce9e8a0a17cd0af1e38a315b744573'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tapb4bd60c8-94', 'timestamp': '2025-11-22T10:08:16.989306', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tapb4bd60c8-94', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:6e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4bd60c8-94'}, 'message_id': '2a2473a8-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'f0cbd616b27cd7af2c529cf9ae97da63b4ff8f43230372b3b720838af4d47168'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tap47b46c17-41', 'timestamp': '2025-11-22T10:08:16.989306', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tap47b46c17-41', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': 
'11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cb:e5:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap47b46c17-41'}, 'message_id': '2a247f7e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': '527bedc79dc1541a2b627760f5da4da99ead6709f2315ccbf1eb902fb57bee44'}]}, 'timestamp': '2025-11-22 10:08:16.990384', '_unique_id': 'b08bf33b95444a94a366fa6761578789'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.991 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.992 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.read.latency volume: 447913465 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.992 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.read.latency volume: 3454074 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.993 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.read.latency volume: 573151009 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.993 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.read.latency volume: 82653084 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b62f9c42-3c72-4cce-9f42-a95b3784da4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 447913465, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-vda', 'timestamp': '2025-11-22T10:08:16.992490', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a24e0ea-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': 'd6be51fc502c66abc9400d242da8cf2ade38c956fcbd67e7e91e38f5ffa055a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3454074, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 
'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-sda', 'timestamp': '2025-11-22T10:08:16.992490', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a24ec98-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': '149a2fb1f6082cf6455f935369318728fc5339425085eceb17f65f6f4967d355'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 573151009, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-vda', 'timestamp': '2025-11-22T10:08:16.992490', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a24f774-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': '0507422a243df43336b5abee23298bbce40426e4d3f395a9ae3da3e046783a3d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 82653084, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-sda', 'timestamp': '2025-11-22T10:08:16.992490', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a2501ec-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': 'c7de7dc27a98a26c51592fd9e36cfacfbaca70d0946f39163b022c1785fa552c'}]}, 'timestamp': '2025-11-22 10:08:16.993695', '_unique_id': '7b0141cce6ae4b61ac397d26a1fd0ca7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.994 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.995 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.995 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1810802209>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1174788018>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1810802209>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1174788018>]
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.996 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.996 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.outgoing.packets volume: 180 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.996 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.outgoing.packets volume: 19 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0412230-0789-43c9-8e70-711ad2662a9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000007-b0411876-4519-4bcb-a325-000d02d8b59d-tap41558171-90', 'timestamp': '2025-11-22T10:08:16.996066', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'tap41558171-90', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:32:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap41558171-90'}, 'message_id': '2a256b78-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.647978671, 'message_signature': 'b41f6b7a48e9e34f6462a3a81839472c172f5230575211189395dd97cd313183'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 180, 'user_id': 
'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tapb4bd60c8-94', 'timestamp': '2025-11-22T10:08:16.996066', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tapb4bd60c8-94', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:6e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4bd60c8-94'}, 'message_id': '2a2577a8-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'ae8de1f33b1342ca29b0c9e728b31a9f9e9a81a141b081fd600d072242fd2883'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 19, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tap47b46c17-41', 'timestamp': '2025-11-22T10:08:16.996066', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tap47b46c17-41', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': 
'11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cb:e5:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap47b46c17-41'}, 'message_id': '2a258590-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'dc438af509b42a039baef4c0e8fe41e9adcccf42fe2eaf2649e3c0afef0b0f0a'}]}, 'timestamp': '2025-11-22 10:08:16.997083', '_unique_id': '9d4a2b4db57240e49a6337d87f339a75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.997 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.998 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.999 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.999 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.write.latency volume: 4125414799 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:16.999 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '532cecb0-9ca5-4ca3-b93b-e329bb6c7d4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-vda', 'timestamp': '2025-11-22T10:08:16.998839', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a25d784-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': '457c8528258c82d7a7e5a398672d8bb9b00135219c8c95ecb1167a15f6239549'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 
'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-sda', 'timestamp': '2025-11-22T10:08:16.998839', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a25e2ce-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': '3ff7edefb92bc00acd4d48d4f3d24e4cb660f8e780d02f3bcbdfcfdc3480fe47'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4125414799, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-vda', 'timestamp': '2025-11-22T10:08:16.998839', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a25ee68-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': 'c7b3d3408a87d575c8a93c15be0c71c9bb8cd12e84cb9b35d342d0c3d96c0dcd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-sda', 'timestamp': '2025-11-22T10:08:16.998839', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a25f930-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': 'f058771040d06da6f58b16310aacde1f4ae16dc02c242c13cdba78a9f5d0260a'}]}, 'timestamp': '2025-11-22 10:08:17.000001', '_unique_id': '50b22985157e4a49b6ce686fe87061cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.000 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.001 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.001 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.002 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.002 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.read.bytes volume: 29698560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.002 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3f6c84a-2eea-4953-b2e2-a44cf8587373', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-vda', 'timestamp': '2025-11-22T10:08:17.001715', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a2647dc-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': '27bd1dc0afe972698a35150dc7c487e4c748189572277259c9252ede1498b178'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 
'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-sda', 'timestamp': '2025-11-22T10:08:17.001715', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a265344-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': '976cf9b9309ac33e240736080ecd9a6c5bd556eeed88e518925f7e2b9f0ae57c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29698560, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-vda', 'timestamp': '2025-11-22T10:08:17.001715', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a265e98-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': '28d87e5ac133dc7d6d246784be1b00c90ee4ed9afea57892ec4ea57e3ef76891'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-sda', 'timestamp': '2025-11-22T10:08:17.001715', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a266b90-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': '9002b01342ab2cab05bb3f6d0a11472a4ae9d67f0798df0c6024cffecde09712'}]}, 'timestamp': '2025-11-22 10:08:17.002927', '_unique_id': 'bdb9006bcf3b420fa9e3ffed676eab16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.003 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.004 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.004 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.004 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.005 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13aded8a-b86b-4dfe-a68f-dee052ed374f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000007-b0411876-4519-4bcb-a325-000d02d8b59d-tap41558171-90', 'timestamp': '2025-11-22T10:08:17.004610', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'tap41558171-90', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:32:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap41558171-90'}, 'message_id': '2a26b94c-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.647978671, 'message_signature': 'bc643a1a2dd2b8921554ca8ce158b13a7724a469e954189b9f08c5929e334cd6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tapb4bd60c8-94', 'timestamp': '2025-11-22T10:08:17.004610', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tapb4bd60c8-94', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:6e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4bd60c8-94'}, 'message_id': '2a26c504-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'c8eba1308ac3dae835afe9d70c2c509b956620935d911a45f0ee495721d7c256'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tap47b46c17-41', 'timestamp': '2025-11-22T10:08:17.004610', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tap47b46c17-41', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': 
'11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cb:e5:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap47b46c17-41'}, 'message_id': '2a26d0bc-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': '7166bced5efe57c8daf9728f102988412b7a0ad0abba3a642b9acd5cf6a78a9e'}]}, 'timestamp': '2025-11-22 10:08:17.005529', '_unique_id': '6fb27f4f7ed1495f87c34c3f22a3bc6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.006 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.007 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.007 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.007 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.007 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e262ced-6ded-4509-a2c1-deeb5053550d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000007-b0411876-4519-4bcb-a325-000d02d8b59d-tap41558171-90', 'timestamp': '2025-11-22T10:08:17.007213', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'tap41558171-90', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:32:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap41558171-90'}, 'message_id': '2a271e8c-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.647978671, 'message_signature': '6d78cc08fae0241df401ef147e64798a7d9a690e4a0fb6a47fd5b25f31201faa'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tapb4bd60c8-94', 'timestamp': '2025-11-22T10:08:17.007213', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tapb4bd60c8-94', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:6e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4bd60c8-94'}, 'message_id': '2a272bb6-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': '1f89d215f9d49ccda323f34577d7e8b519c6b545421e95c6394d8ce4796a2f4d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tap47b46c17-41', 'timestamp': '2025-11-22T10:08:17.007213', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tap47b46c17-41', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': 
'11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cb:e5:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap47b46c17-41'}, 'message_id': '2a273962-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'bf928405090e9a979a93553875e51cd9ee11d449fe9cb51dcdefd5bce7990bf8'}]}, 'timestamp': '2025-11-22 10:08:17.008208', '_unique_id': '79956689ddbe412da4ff83e61224da5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.008 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.009 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.009 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.010 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.010 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.010 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec9a1085-2d05-491b-9fbb-aedfc8baf0e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-vda', 'timestamp': '2025-11-22T10:08:17.009831', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a27857a-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.613267973, 'message_signature': '78291654be9a6afcccdfe4a852f1c7dd94fcd8652a717bb3ccab3c187e2c604b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 
'b0411876-4519-4bcb-a325-000d02d8b59d-sda', 'timestamp': '2025-11-22T10:08:17.009831', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a279178-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.613267973, 'message_signature': 'a90273881c50f1ac9544670d56e3dc71887513bbf0d539e1b38ffdc19a452401'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-vda', 'timestamp': '2025-11-22T10:08:17.009831', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a279dee-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.62997666, 'message_signature': '797851ea7df4d23ce813e09c143eddc39a0a75fd31c4edfb07f1e63a101394ec'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-sda', 'timestamp': '2025-11-22T10:08:17.009831', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a27a8d4-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.62997666, 'message_signature': '2553c5a639422a87f12c8c7acb870ac407dac8b742d56ac1f46f51b840b40fda'}]}, 'timestamp': '2025-11-22 10:08:17.011051', '_unique_id': '8f2ecaa3a86f40e688986410d1ac2696'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.011 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.012 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.013 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.013 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1810802209>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1174788018>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1810802209>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1174788018>]
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.013 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.013 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.013 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.014 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83ac6a27-e66e-4b01-beb7-643c893647c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000007-b0411876-4519-4bcb-a325-000d02d8b59d-tap41558171-90', 'timestamp': '2025-11-22T10:08:17.013448', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'tap41558171-90', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:32:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap41558171-90'}, 'message_id': '2a281364-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.647978671, 'message_signature': 'ec0c08ba44875c2935b2afe0a22fb1cf04450009fdd10e84abe42e82190ffeb0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tapb4bd60c8-94', 'timestamp': '2025-11-22T10:08:17.013448', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tapb4bd60c8-94', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:6e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4bd60c8-94'}, 'message_id': '2a281f44-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'd21b086165906d1c34285116ba491869eaa7881916a96ec03f4dee8e21904de2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tap47b46c17-41', 'timestamp': '2025-11-22T10:08:17.013448', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tap47b46c17-41', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': 
'11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cb:e5:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap47b46c17-41'}, 'message_id': '2a2829ee-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'fb5cf24dce11e896417370a23345441aab5d5aa70d864e238e87934d222d5b0e'}]}, 'timestamp': '2025-11-22 10:08:17.014360', '_unique_id': 'bec30f3fcc4d418da260aac50600b548'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.015 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.016 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.016 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.incoming.packets volume: 177 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.016 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12427d06-dea1-4309-84cb-cea00edf041c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000007-b0411876-4519-4bcb-a325-000d02d8b59d-tap41558171-90', 'timestamp': '2025-11-22T10:08:17.016002', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'tap41558171-90', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:32:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap41558171-90'}, 'message_id': '2a287868-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.647978671, 'message_signature': '4de6953b9ead5cbfa5d552528b1b5c54c94391dfb0b3b0da8628d8faf3c034e7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 177, 'user_id': 
'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tapb4bd60c8-94', 'timestamp': '2025-11-22T10:08:17.016002', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tapb4bd60c8-94', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:6e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4bd60c8-94'}, 'message_id': '2a2883d0-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': '517548218ab98027aa562037325abd02124b9cb229406b3257eb30c811e2d25b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tap47b46c17-41', 'timestamp': '2025-11-22T10:08:17.016002', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tap47b46c17-41', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': 
'11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cb:e5:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap47b46c17-41'}, 'message_id': '2a288fa6-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': '6d304aff088637caeb14b0099c5487b205ba11a3f7d798e3c52383d2711a8b5b'}]}, 'timestamp': '2025-11-22 10:08:17.016966', '_unique_id': '114c24bfb3284299a8b382dea04f6b0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.017 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.018 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.018 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.018 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.019 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.read.requests volume: 1074 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.019 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '480e99a7-fc9c-40ba-9725-83f17b34f53d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-vda', 'timestamp': '2025-11-22T10:08:17.018668', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a28de3e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': 'cca12acf55edab5deb131c2e75547a8bb4063748263750298850264c844257f9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 
'resource_id': 'b0411876-4519-4bcb-a325-000d02d8b59d-sda', 'timestamp': '2025-11-22T10:08:17.018668', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'instance-00000007', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a28e938-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.656138634, 'message_signature': 'a621abe0a124e1516bab77a536f822cc086a9c7d62bc50e38d1619a738c38358'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1074, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-vda', 'timestamp': '2025-11-22T10:08:17.018668', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2a28f3ce-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': '4250698b07ae23ecfd5136f2bdf82450c1b137225e68e739379b676cc1d1bcf5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c-sda', 'timestamp': '2025-11-22T10:08:17.018668', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'instance-00000006', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2a28ff2c-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.68164022, 'message_signature': '5bf848aa11cf565cc438142649a64918dc3f919a492e76bebf6070701d599080'}]}, 'timestamp': '2025-11-22 10:08:17.019812', '_unique_id': '3cc798d94533464b9910f0dd2beb1284'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.020 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.021 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.021 12 DEBUG ceilometer.compute.pollsters [-] b0411876-4519-4bcb-a325-000d02d8b59d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.021 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.022 12 DEBUG ceilometer.compute.pollsters [-] d5460be9-d4a4-45e1-8bd1-99144801279c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d6874ea-1adf-4f1c-a284-912aa8aacabb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000007-b0411876-4519-4bcb-a325-000d02d8b59d-tap41558171-90', 'timestamp': '2025-11-22T10:08:17.021391', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1810802209', 'name': 'tap41558171-90', 'instance_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:32:cc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap41558171-90'}, 'message_id': '2a294874-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.647978671, 'message_signature': '49b08048f1549a2083ec22e80641a6bd6108ae5bd5373e40c250f9d9d152604c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tapb4bd60c8-94', 'timestamp': '2025-11-22T10:08:17.021391', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tapb4bd60c8-94', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:6e:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4bd60c8-94'}, 'message_id': '2a295800-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'a55778e8ec5c5b1b4f4fad1fe1b284015f81494873b98c58e29044c81fa00898'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-00000006-d5460be9-d4a4-45e1-8bd1-99144801279c-tap47b46c17-41', 'timestamp': '2025-11-22T10:08:17.021391', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1174788018', 'name': 'tap47b46c17-41', 'instance_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'instance_type': 'm1.nano', 'host': 
'11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cb:e5:b6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap47b46c17-41'}, 'message_id': '2a296318-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3554.651100996, 'message_signature': 'f14d7c58a879c651c75ff81b781e9ec64561a0e0be6783b409b225fb3d90cdf8'}]}, 'timestamp': '2025-11-22 10:08:17.022382', '_unique_id': '5f230acf5dce42489cbd68c34a522495'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.023 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.024 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.024 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:08:17 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:08:17.024 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1810802209>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1174788018>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1810802209>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1174788018>]
Nov 22 10:08:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:17.936 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:08:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:17.936 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:08:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:08:17.937 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:08:18 compute-0 nova_compute[186981]: 2025-11-22 10:08:18.425 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:08:18 compute-0 nova_compute[186981]: 2025-11-22 10:08:18.425 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:08:18 compute-0 nova_compute[186981]: 2025-11-22 10:08:18.426 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:08:19 compute-0 nova_compute[186981]: 2025-11-22 10:08:19.087 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:19 compute-0 nova_compute[186981]: 2025-11-22 10:08:19.590 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:08:20 compute-0 nova_compute[186981]: 2025-11-22 10:08:20.476 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:21 compute-0 nova_compute[186981]: 2025-11-22 10:08:21.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:08:23 compute-0 podman[216625]: 2025-11-22 10:08:23.615585213 +0000 UTC m=+0.063147105 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:08:24 compute-0 nova_compute[186981]: 2025-11-22 10:08:24.089 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:24 compute-0 ovn_controller[95329]: 2025-11-22T10:08:24Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:32:cc 10.100.0.30
Nov 22 10:08:24 compute-0 ovn_controller[95329]: 2025-11-22T10:08:24Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:32:cc 10.100.0.30
Nov 22 10:08:25 compute-0 nova_compute[186981]: 2025-11-22 10:08:25.479 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:29 compute-0 nova_compute[186981]: 2025-11-22 10:08:29.093 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:30 compute-0 nova_compute[186981]: 2025-11-22 10:08:30.482 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:33 compute-0 nova_compute[186981]: 2025-11-22 10:08:33.105 186985 DEBUG nova.compute.manager [req-b4b9b741-eeb8-4719-8002-e03afa0bbf7f req-aaddedc1-6b0f-4819-a65e-e743e042bed0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-changed-47b46c17-414f-45b6-b0f7-72fc46a774d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:08:33 compute-0 nova_compute[186981]: 2025-11-22 10:08:33.105 186985 DEBUG nova.compute.manager [req-b4b9b741-eeb8-4719-8002-e03afa0bbf7f req-aaddedc1-6b0f-4819-a65e-e743e042bed0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Refreshing instance network info cache due to event network-changed-47b46c17-414f-45b6-b0f7-72fc46a774d5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:08:33 compute-0 nova_compute[186981]: 2025-11-22 10:08:33.106 186985 DEBUG oslo_concurrency.lockutils [req-b4b9b741-eeb8-4719-8002-e03afa0bbf7f req-aaddedc1-6b0f-4819-a65e-e743e042bed0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:08:33 compute-0 nova_compute[186981]: 2025-11-22 10:08:33.106 186985 DEBUG oslo_concurrency.lockutils [req-b4b9b741-eeb8-4719-8002-e03afa0bbf7f req-aaddedc1-6b0f-4819-a65e-e743e042bed0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:08:33 compute-0 nova_compute[186981]: 2025-11-22 10:08:33.106 186985 DEBUG nova.network.neutron [req-b4b9b741-eeb8-4719-8002-e03afa0bbf7f req-aaddedc1-6b0f-4819-a65e-e743e042bed0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Refreshing network info cache for port 47b46c17-414f-45b6-b0f7-72fc46a774d5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:08:34 compute-0 nova_compute[186981]: 2025-11-22 10:08:34.135 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:34 compute-0 nova_compute[186981]: 2025-11-22 10:08:34.435 186985 DEBUG nova.network.neutron [req-b4b9b741-eeb8-4719-8002-e03afa0bbf7f req-aaddedc1-6b0f-4819-a65e-e743e042bed0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updated VIF entry in instance network info cache for port 47b46c17-414f-45b6-b0f7-72fc46a774d5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:08:34 compute-0 nova_compute[186981]: 2025-11-22 10:08:34.436 186985 DEBUG nova.network.neutron [req-b4b9b741-eeb8-4719-8002-e03afa0bbf7f req-aaddedc1-6b0f-4819-a65e-e743e042bed0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updating instance_info_cache with network_info: [{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:08:34 compute-0 nova_compute[186981]: 2025-11-22 10:08:34.456 186985 DEBUG oslo_concurrency.lockutils [req-b4b9b741-eeb8-4719-8002-e03afa0bbf7f req-aaddedc1-6b0f-4819-a65e-e743e042bed0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:08:34 compute-0 podman[216650]: 2025-11-22 10:08:34.60937899 +0000 UTC m=+0.061741597 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 10:08:34 compute-0 podman[216651]: 2025-11-22 10:08:34.65514147 +0000 UTC m=+0.103113817 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 10:08:35 compute-0 nova_compute[186981]: 2025-11-22 10:08:35.486 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:39 compute-0 nova_compute[186981]: 2025-11-22 10:08:39.182 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:39 compute-0 podman[216697]: 2025-11-22 10:08:39.598390315 +0000 UTC m=+0.054698874 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 10:08:39 compute-0 podman[216698]: 2025-11-22 10:08:39.606419394 +0000 UTC m=+0.054751525 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter)
Nov 22 10:08:40 compute-0 nova_compute[186981]: 2025-11-22 10:08:40.487 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:43 compute-0 podman[216739]: 2025-11-22 10:08:43.611604542 +0000 UTC m=+0.068393288 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 10:08:43 compute-0 podman[216738]: 2025-11-22 10:08:43.614871951 +0000 UTC m=+0.065536280 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 10:08:44 compute-0 nova_compute[186981]: 2025-11-22 10:08:44.228 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:45 compute-0 nova_compute[186981]: 2025-11-22 10:08:45.490 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:49 compute-0 nova_compute[186981]: 2025-11-22 10:08:49.272 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:50 compute-0 nova_compute[186981]: 2025-11-22 10:08:50.493 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:54 compute-0 nova_compute[186981]: 2025-11-22 10:08:54.349 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:54 compute-0 podman[216781]: 2025-11-22 10:08:54.603678224 +0000 UTC m=+0.058262832 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 10:08:55 compute-0 nova_compute[186981]: 2025-11-22 10:08:55.510 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:08:59 compute-0 nova_compute[186981]: 2025-11-22 10:08:59.385 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.554 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.794 186985 DEBUG oslo_concurrency.lockutils [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "b0411876-4519-4bcb-a325-000d02d8b59d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.795 186985 DEBUG oslo_concurrency.lockutils [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.795 186985 DEBUG oslo_concurrency.lockutils [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.796 186985 DEBUG oslo_concurrency.lockutils [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.796 186985 DEBUG oslo_concurrency.lockutils [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.798 186985 INFO nova.compute.manager [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Terminating instance
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.799 186985 DEBUG nova.compute.manager [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:09:00 compute-0 kernel: tap41558171-90 (unregistering): left promiscuous mode
Nov 22 10:09:00 compute-0 NetworkManager[55425]: <info>  [1763806140.8527] device (tap41558171-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:09:00 compute-0 ovn_controller[95329]: 2025-11-22T10:09:00Z|00105|binding|INFO|Releasing lport 41558171-90e6-4dc7-9cc3-1edd109bc81a from this chassis (sb_readonly=0)
Nov 22 10:09:00 compute-0 ovn_controller[95329]: 2025-11-22T10:09:00Z|00106|binding|INFO|Setting lport 41558171-90e6-4dc7-9cc3-1edd109bc81a down in Southbound
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.861 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:00 compute-0 ovn_controller[95329]: 2025-11-22T10:09:00Z|00107|binding|INFO|Removing iface tap41558171-90 ovn-installed in OVS
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.863 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.871 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:32:cc 10.100.0.30'], port_security=['fa:16:3e:8b:32:cc 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': 'b0411876-4519-4bcb-a325-000d02d8b59d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '399685b7-1818-4f78-a311-c5ac964c48b4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=302b0ac8-02d4-44fd-b1fa-ce5720e457ac, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=41558171-90e6-4dc7-9cc3-1edd109bc81a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.872 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 41558171-90e6-4dc7-9cc3-1edd109bc81a in datapath c442911c-33e7-4086-a8a7-29e86a0c5c15 unbound from our chassis
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.873 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c442911c-33e7-4086-a8a7-29e86a0c5c15
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.880 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.887 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[fc48cddb-6892-43ad-898a-4716e4f1e3bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.914 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[26b34435-cbc0-4cc3-ac29-fc4a9acdcd5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.917 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[f831e7c1-294d-4bf8-9e15-3c92ee45f721]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:00 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 22 10:09:00 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 14.504s CPU time.
Nov 22 10:09:00 compute-0 systemd-machined[153303]: Machine qemu-7-instance-00000007 terminated.
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.948 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[bc92c5bf-3d09-4a1b-861a-a3315399acb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.966 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[aa50a863-34dc-4a36-a194-ecfc07a61382]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc442911c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:70:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353042, 'reachable_time': 23172, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 7, 'inoctets': 524, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 7, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 524, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 7, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216816, 'error': None, 'target': 'ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.980 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[33ddffe4-6c79-4cb8-9a42-280dbf4f08f2]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc442911c-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353054, 'tstamp': 353054}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216817, 'error': None, 'target': 'ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapc442911c-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353056, 'tstamp': 353056}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216817, 'error': None, 'target': 'ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.981 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc442911c-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.983 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:00 compute-0 nova_compute[186981]: 2025-11-22 10:09:00.987 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.987 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc442911c-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.988 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.988 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc442911c-30, col_values=(('external_ids', {'iface-id': '084ee2e6-6c0b-4448-aed8-b5dfcc7529c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:00.988 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.061 186985 INFO nova.virt.libvirt.driver [-] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Instance destroyed successfully.
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.062 186985 DEBUG nova.objects.instance [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid b0411876-4519-4bcb-a325-000d02d8b59d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.106 186985 DEBUG nova.virt.libvirt.vif [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:08:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1810802209',display_name='tempest-TestNetworkBasicOps-server-1810802209',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1810802209',id=7,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAbbXXsIxD9wxsfv66fFL+lSheLjpBiFb5vj3NTwETNWQ8ZD8FwpAhGjO41WyXXNuX9cxL/oHwQYwznwPPVtjIv+ZMh0inUp0Q00EufkgXKzcCZ/Bjs7PX3eoTREcBk1jw==',key_name='tempest-TestNetworkBasicOps-1595292033',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:08:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-pokmlw76',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:08:10Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=b0411876-4519-4bcb-a325-000d02d8b59d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "address": "fa:16:3e:8b:32:cc", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41558171-90", "ovs_interfaceid": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.106 186985 DEBUG nova.network.os_vif_util [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "address": "fa:16:3e:8b:32:cc", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41558171-90", "ovs_interfaceid": "41558171-90e6-4dc7-9cc3-1edd109bc81a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.107 186985 DEBUG nova.network.os_vif_util [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:32:cc,bridge_name='br-int',has_traffic_filtering=True,id=41558171-90e6-4dc7-9cc3-1edd109bc81a,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41558171-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.107 186985 DEBUG os_vif [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:32:cc,bridge_name='br-int',has_traffic_filtering=True,id=41558171-90e6-4dc7-9cc3-1edd109bc81a,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41558171-90') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.108 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.109 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41558171-90, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.110 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.113 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.116 186985 INFO os_vif [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:32:cc,bridge_name='br-int',has_traffic_filtering=True,id=41558171-90e6-4dc7-9cc3-1edd109bc81a,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41558171-90')
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.116 186985 INFO nova.virt.libvirt.driver [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Deleting instance files /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d_del
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.117 186985 INFO nova.virt.libvirt.driver [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Deletion of /var/lib/nova/instances/b0411876-4519-4bcb-a325-000d02d8b59d_del complete
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.242 186985 INFO nova.compute.manager [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Took 0.44 seconds to destroy the instance on the hypervisor.
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.242 186985 DEBUG oslo.service.loopingcall [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.242 186985 DEBUG nova.compute.manager [-] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.243 186985 DEBUG nova.network.neutron [-] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.956 186985 DEBUG nova.compute.manager [req-c8c13922-a928-4265-9192-2985e9117cd4 req-cd4a2e93-247c-4a62-802f-bda2c7d76133 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Received event network-vif-unplugged-41558171-90e6-4dc7-9cc3-1edd109bc81a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.957 186985 DEBUG oslo_concurrency.lockutils [req-c8c13922-a928-4265-9192-2985e9117cd4 req-cd4a2e93-247c-4a62-802f-bda2c7d76133 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.957 186985 DEBUG oslo_concurrency.lockutils [req-c8c13922-a928-4265-9192-2985e9117cd4 req-cd4a2e93-247c-4a62-802f-bda2c7d76133 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.957 186985 DEBUG oslo_concurrency.lockutils [req-c8c13922-a928-4265-9192-2985e9117cd4 req-cd4a2e93-247c-4a62-802f-bda2c7d76133 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.957 186985 DEBUG nova.compute.manager [req-c8c13922-a928-4265-9192-2985e9117cd4 req-cd4a2e93-247c-4a62-802f-bda2c7d76133 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] No waiting events found dispatching network-vif-unplugged-41558171-90e6-4dc7-9cc3-1edd109bc81a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:09:01 compute-0 nova_compute[186981]: 2025-11-22 10:09:01.957 186985 DEBUG nova.compute.manager [req-c8c13922-a928-4265-9192-2985e9117cd4 req-cd4a2e93-247c-4a62-802f-bda2c7d76133 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Received event network-vif-unplugged-41558171-90e6-4dc7-9cc3-1edd109bc81a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 10:09:02 compute-0 ovn_controller[95329]: 2025-11-22T10:09:02Z|00108|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Nov 22 10:09:02 compute-0 nova_compute[186981]: 2025-11-22 10:09:02.955 186985 DEBUG nova.network.neutron [-] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:09:02 compute-0 nova_compute[186981]: 2025-11-22 10:09:02.974 186985 INFO nova.compute.manager [-] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Took 1.73 seconds to deallocate network for instance.
Nov 22 10:09:03 compute-0 nova_compute[186981]: 2025-11-22 10:09:03.020 186985 DEBUG oslo_concurrency.lockutils [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:03 compute-0 nova_compute[186981]: 2025-11-22 10:09:03.020 186985 DEBUG oslo_concurrency.lockutils [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:03 compute-0 nova_compute[186981]: 2025-11-22 10:09:03.076 186985 DEBUG nova.compute.provider_tree [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:09:03 compute-0 nova_compute[186981]: 2025-11-22 10:09:03.088 186985 DEBUG nova.scheduler.client.report [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:09:03 compute-0 nova_compute[186981]: 2025-11-22 10:09:03.106 186985 DEBUG oslo_concurrency.lockutils [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:03 compute-0 nova_compute[186981]: 2025-11-22 10:09:03.132 186985 INFO nova.scheduler.client.report [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance b0411876-4519-4bcb-a325-000d02d8b59d
Nov 22 10:09:03 compute-0 nova_compute[186981]: 2025-11-22 10:09:03.203 186985 DEBUG oslo_concurrency.lockutils [None req-a351219e-60de-4101-bc9a-e16b2b61379f fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.409s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:04 compute-0 nova_compute[186981]: 2025-11-22 10:09:04.077 186985 DEBUG nova.compute.manager [req-543f522a-889f-4370-80c8-97ef990932d4 req-d6eb8f0c-fff9-4020-a517-7b5663593e11 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Received event network-vif-plugged-41558171-90e6-4dc7-9cc3-1edd109bc81a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:04 compute-0 nova_compute[186981]: 2025-11-22 10:09:04.078 186985 DEBUG oslo_concurrency.lockutils [req-543f522a-889f-4370-80c8-97ef990932d4 req-d6eb8f0c-fff9-4020-a517-7b5663593e11 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:04 compute-0 nova_compute[186981]: 2025-11-22 10:09:04.078 186985 DEBUG oslo_concurrency.lockutils [req-543f522a-889f-4370-80c8-97ef990932d4 req-d6eb8f0c-fff9-4020-a517-7b5663593e11 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:04 compute-0 nova_compute[186981]: 2025-11-22 10:09:04.078 186985 DEBUG oslo_concurrency.lockutils [req-543f522a-889f-4370-80c8-97ef990932d4 req-d6eb8f0c-fff9-4020-a517-7b5663593e11 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "b0411876-4519-4bcb-a325-000d02d8b59d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:04 compute-0 nova_compute[186981]: 2025-11-22 10:09:04.078 186985 DEBUG nova.compute.manager [req-543f522a-889f-4370-80c8-97ef990932d4 req-d6eb8f0c-fff9-4020-a517-7b5663593e11 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] No waiting events found dispatching network-vif-plugged-41558171-90e6-4dc7-9cc3-1edd109bc81a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:09:04 compute-0 nova_compute[186981]: 2025-11-22 10:09:04.079 186985 WARNING nova.compute.manager [req-543f522a-889f-4370-80c8-97ef990932d4 req-d6eb8f0c-fff9-4020-a517-7b5663593e11 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Received unexpected event network-vif-plugged-41558171-90e6-4dc7-9cc3-1edd109bc81a for instance with vm_state deleted and task_state None.
Nov 22 10:09:04 compute-0 nova_compute[186981]: 2025-11-22 10:09:04.079 186985 DEBUG nova.compute.manager [req-543f522a-889f-4370-80c8-97ef990932d4 req-d6eb8f0c-fff9-4020-a517-7b5663593e11 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Received event network-vif-deleted-41558171-90e6-4dc7-9cc3-1edd109bc81a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.075 186985 DEBUG oslo_concurrency.lockutils [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "interface-d5460be9-d4a4-45e1-8bd1-99144801279c-47b46c17-414f-45b6-b0f7-72fc46a774d5" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.075 186985 DEBUG oslo_concurrency.lockutils [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "interface-d5460be9-d4a4-45e1-8bd1-99144801279c-47b46c17-414f-45b6-b0f7-72fc46a774d5" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.089 186985 DEBUG nova.objects.instance [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'flavor' on Instance uuid d5460be9-d4a4-45e1-8bd1-99144801279c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.108 186985 DEBUG nova.virt.libvirt.vif [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174788018',display_name='tempest-TestNetworkBasicOps-server-1174788018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174788018',id=6,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+BWSuiLhQPxAECcK7DaVlWzFtnG0hn0O+hqo9OO4MlApMhNsc33zI/cmxJx6fZIyL5GfThNk2CtY3og8M02CpWqQtXFgtJTqIB8zeQxnYsQ//S5ibUsgIqYg8zuPI+Jg==',key_name='tempest-TestNetworkBasicOps-371597924',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:07:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ir8n07cx',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:07:23Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=d5460be9-d4a4-45e1-8bd1-99144801279c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.109 186985 DEBUG nova.network.os_vif_util [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.109 186985 DEBUG nova.network.os_vif_util [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.111 186985 DEBUG nova.virt.libvirt.guest [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cb:e5:b6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47b46c17-41"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.114 186985 DEBUG nova.virt.libvirt.guest [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cb:e5:b6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47b46c17-41"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.115 186985 DEBUG nova.virt.libvirt.driver [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Attempting to detach device tap47b46c17-41 from instance d5460be9-d4a4-45e1-8bd1-99144801279c from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.116 186985 DEBUG nova.virt.libvirt.guest [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] detach device xml: <interface type="ethernet">
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <mac address="fa:16:3e:cb:e5:b6"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <model type="virtio"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <mtu size="1442"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <target dev="tap47b46c17-41"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]: </interface>
Nov 22 10:09:05 compute-0 nova_compute[186981]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.152 186985 DEBUG nova.virt.libvirt.guest [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cb:e5:b6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47b46c17-41"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.155 186985 DEBUG nova.virt.libvirt.guest [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:cb:e5:b6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47b46c17-41"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <name>instance-00000006</name>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <uuid>d5460be9-d4a4-45e1-8bd1-99144801279c</uuid>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-1174788018</nova:name>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:07:52</nova:creationTime>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:port uuid="b4bd60c8-946f-4124-b413-02ee57a5b597">
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:port uuid="47b46c17-414f-45b6-b0f7-72fc46a774d5">
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:09:05 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <memory unit='KiB'>131072</memory>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <vcpu placement='static'>1</vcpu>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <resource>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <partition>/machine</partition>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </resource>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <sysinfo type='smbios'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <system>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='manufacturer'>RDO</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='serial'>d5460be9-d4a4-45e1-8bd1-99144801279c</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='uuid'>d5460be9-d4a4-45e1-8bd1-99144801279c</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='family'>Virtual Machine</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </system>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <os>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <boot dev='hd'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <smbios mode='sysinfo'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </os>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <features>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <vmcoreinfo state='on'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </features>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <vendor>AMD</vendor>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='x2apic'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc-deadline'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='hypervisor'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc_adjust'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='spec-ctrl'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='stibp'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='ssbd'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='cmp_legacy'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='overflow-recov'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='succor'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='ibrs'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='amd-ssbd'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='virt-ssbd'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='lbrv'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='tsc-scale'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='vmcb-clean'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='flushbyasid'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='pause-filter'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='pfthreshold'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='xsaves'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='svm'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='topoext'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='npt'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='nrip-save'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <clock offset='utc'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <timer name='hpet' present='no'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <on_poweroff>destroy</on_poweroff>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <on_reboot>restart</on_reboot>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <on_crash>destroy</on_crash>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <disk type='file' device='disk'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk' index='2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <backingStore type='file' index='3'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:         <format type='raw'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:         <source file='/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:         <backingStore/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       </backingStore>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target dev='vda' bus='virtio'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='virtio-disk0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <disk type='file' device='cdrom'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk.config' index='1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <backingStore/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target dev='sda' bus='sata'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <readonly/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='sata0-0-0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pcie.0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='1' port='0x10'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='2' port='0x11'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='3' port='0x12'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.3'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='4' port='0x13'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.4'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='5' port='0x14'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.5'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='6' port='0x15'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.6'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='7' port='0x16'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.7'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='8' port='0x17'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.8'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='9' port='0x18'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.9'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='10' port='0x19'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.10'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='11' port='0x1a'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.11'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='12' port='0x1b'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.12'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='13' port='0x1c'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.13'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='14' port='0x1d'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.14'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='15' port='0x1e'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.15'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='16' port='0x1f'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.16'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='17' port='0x20'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.17'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='18' port='0x21'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.18'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='19' port='0x22'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.19'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='20' port='0x23'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.20'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='21' port='0x24'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.21'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='22' port='0x25'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.22'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='23' port='0x26'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.23'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='24' port='0x27'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.24'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='25' port='0x28'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.25'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-pci-bridge'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.26'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='usb'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='sata' index='0'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='ide'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <interface type='ethernet'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <mac address='fa:16:3e:d0:6e:90'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target dev='tapb4bd60c8-94'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model type='virtio'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <mtu size='1442'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='net0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <interface type='ethernet'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <mac address='fa:16:3e:cb:e5:b6'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target dev='tap47b46c17-41'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model type='virtio'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <mtu size='1442'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='net1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <serial type='pty'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/console.log' append='off'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target type='isa-serial' port='0'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:         <model name='isa-serial'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       </target>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/console.log' append='off'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target type='serial' port='0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </console>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <input type='tablet' bus='usb'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='input0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='usb' bus='0' port='1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <input type='mouse' bus='ps2'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='input1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <input type='keyboard' bus='ps2'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='input2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <listen type='address' address='::0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </graphics>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <audio id='1' type='none'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <video>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='video0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </video>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <watchdog model='itco' action='reset'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='watchdog0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </watchdog>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <memballoon model='virtio'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <stats period='10'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='balloon0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <rng model='virtio'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <backend model='random'>/dev/urandom</backend>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='rng0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <label>system_u:system_r:svirt_t:s0:c260,c820</label>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c260,c820</imagelabel>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <label>+107:+107</label>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <imagelabel>+107:+107</imagelabel>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:09:05 compute-0 nova_compute[186981]: </domain>
Nov 22 10:09:05 compute-0 nova_compute[186981]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.156 186985 INFO nova.virt.libvirt.driver [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully detached device tap47b46c17-41 from instance d5460be9-d4a4-45e1-8bd1-99144801279c from the persistent domain config.
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.157 186985 DEBUG nova.virt.libvirt.driver [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] (1/8): Attempting to detach device tap47b46c17-41 with device alias net1 from instance d5460be9-d4a4-45e1-8bd1-99144801279c from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.157 186985 DEBUG nova.virt.libvirt.guest [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] detach device xml: <interface type="ethernet">
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <mac address="fa:16:3e:cb:e5:b6"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <model type="virtio"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <mtu size="1442"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <target dev="tap47b46c17-41"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]: </interface>
Nov 22 10:09:05 compute-0 nova_compute[186981]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 22 10:09:05 compute-0 kernel: tap47b46c17-41 (unregistering): left promiscuous mode
Nov 22 10:09:05 compute-0 NetworkManager[55425]: <info>  [1763806145.2228] device (tap47b46c17-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:09:05 compute-0 ovn_controller[95329]: 2025-11-22T10:09:05Z|00109|binding|INFO|Releasing lport 47b46c17-414f-45b6-b0f7-72fc46a774d5 from this chassis (sb_readonly=0)
Nov 22 10:09:05 compute-0 ovn_controller[95329]: 2025-11-22T10:09:05Z|00110|binding|INFO|Setting lport 47b46c17-414f-45b6-b0f7-72fc46a774d5 down in Southbound
Nov 22 10:09:05 compute-0 ovn_controller[95329]: 2025-11-22T10:09:05Z|00111|binding|INFO|Removing iface tap47b46c17-41 ovn-installed in OVS
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.269 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.270 186985 DEBUG nova.virt.libvirt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Received event <DeviceRemovedEvent: 1763806145.2703412, d5460be9-d4a4-45e1-8bd1-99144801279c => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.271 186985 DEBUG nova.virt.libvirt.driver [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Start waiting for the detach event from libvirt for device tap47b46c17-41 with device alias net1 for instance d5460be9-d4a4-45e1-8bd1-99144801279c _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.271 186985 DEBUG nova.virt.libvirt.guest [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cb:e5:b6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47b46c17-41"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:09:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:05.273 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:e5:b6 10.100.0.24', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=302b0ac8-02d4-44fd-b1fa-ce5720e457ac, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=47b46c17-414f-45b6-b0f7-72fc46a774d5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.274 186985 DEBUG nova.virt.libvirt.guest [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:cb:e5:b6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47b46c17-41"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <name>instance-00000006</name>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <uuid>d5460be9-d4a4-45e1-8bd1-99144801279c</uuid>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-1174788018</nova:name>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:07:52</nova:creationTime>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:port uuid="b4bd60c8-946f-4124-b413-02ee57a5b597">
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:port uuid="47b46c17-414f-45b6-b0f7-72fc46a774d5">
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:09:05 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <memory unit='KiB'>131072</memory>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <vcpu placement='static'>1</vcpu>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <resource>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <partition>/machine</partition>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </resource>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <sysinfo type='smbios'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <system>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='manufacturer'>RDO</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='serial'>d5460be9-d4a4-45e1-8bd1-99144801279c</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='uuid'>d5460be9-d4a4-45e1-8bd1-99144801279c</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <entry name='family'>Virtual Machine</entry>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </system>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <os>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <boot dev='hd'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <smbios mode='sysinfo'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </os>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <features>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <vmcoreinfo state='on'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </features>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <vendor>AMD</vendor>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='x2apic'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc-deadline'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='hypervisor'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc_adjust'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='spec-ctrl'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='stibp'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='ssbd'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='cmp_legacy'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='overflow-recov'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='succor'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='ibrs'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='amd-ssbd'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='virt-ssbd'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='lbrv'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='tsc-scale'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='vmcb-clean'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='flushbyasid'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='pause-filter'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='pfthreshold'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='xsaves'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='svm'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='require' name='topoext'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='npt'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <feature policy='disable' name='nrip-save'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <clock offset='utc'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <timer name='hpet' present='no'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <on_poweroff>destroy</on_poweroff>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <on_reboot>restart</on_reboot>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <on_crash>destroy</on_crash>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <disk type='file' device='disk'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk' index='2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <backingStore type='file' index='3'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:         <format type='raw'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:         <source file='/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:         <backingStore/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       </backingStore>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target dev='vda' bus='virtio'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='virtio-disk0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <disk type='file' device='cdrom'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk.config' index='1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <backingStore/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target dev='sda' bus='sata'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <readonly/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='sata0-0-0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pcie.0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='1' port='0x10'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='2' port='0x11'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='3' port='0x12'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.3'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='4' port='0x13'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.4'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='5' port='0x14'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.5'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='6' port='0x15'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.6'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='7' port='0x16'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.7'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='8' port='0x17'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.8'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='9' port='0x18'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.9'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='10' port='0x19'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.10'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='11' port='0x1a'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.11'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='12' port='0x1b'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.12'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='13' port='0x1c'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.13'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='14' port='0x1d'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.14'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='15' port='0x1e'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.15'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='16' port='0x1f'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.16'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='17' port='0x20'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.17'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='18' port='0x21'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.18'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='19' port='0x22'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.19'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='20' port='0x23'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.20'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='21' port='0x24'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.21'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='22' port='0x25'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.22'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='23' port='0x26'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.23'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='24' port='0x27'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.24'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target chassis='25' port='0x28'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.25'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model name='pcie-pci-bridge'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='pci.26'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='usb'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <controller type='sata' index='0'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='ide'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <interface type='ethernet'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <mac address='fa:16:3e:d0:6e:90'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target dev='tapb4bd60c8-94'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model type='virtio'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <mtu size='1442'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='net0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <serial type='pty'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/console.log' append='off'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target type='isa-serial' port='0'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:         <model name='isa-serial'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       </target>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/console.log' append='off'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <target type='serial' port='0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </console>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <input type='tablet' bus='usb'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='input0'/>
Nov 22 10:09:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:05.274 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 47b46c17-414f-45b6-b0f7-72fc46a774d5 in datapath c442911c-33e7-4086-a8a7-29e86a0c5c15 unbound from our chassis
Nov 22 10:09:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:05.275 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c442911c-33e7-4086-a8a7-29e86a0c5c15, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='usb' bus='0' port='1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <input type='mouse' bus='ps2'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='input1'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <input type='keyboard' bus='ps2'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='input2'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <listen type='address' address='::0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </graphics>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <audio id='1' type='none'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <video>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='video0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </video>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <watchdog model='itco' action='reset'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='watchdog0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </watchdog>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <memballoon model='virtio'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <stats period='10'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='balloon0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <rng model='virtio'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <backend model='random'>/dev/urandom</backend>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <alias name='rng0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <label>system_u:system_r:svirt_t:s0:c260,c820</label>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c260,c820</imagelabel>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <label>+107:+107</label>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <imagelabel>+107:+107</imagelabel>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:09:05 compute-0 nova_compute[186981]: </domain>
Nov 22 10:09:05 compute-0 nova_compute[186981]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.274 186985 INFO nova.virt.libvirt.driver [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully detached device tap47b46c17-41 from instance d5460be9-d4a4-45e1-8bd1-99144801279c from the live domain config.
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.275 186985 DEBUG nova.virt.libvirt.vif [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174788018',display_name='tempest-TestNetworkBasicOps-server-1174788018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174788018',id=6,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+BWSuiLhQPxAECcK7DaVlWzFtnG0hn0O+hqo9OO4MlApMhNsc33zI/cmxJx6fZIyL5GfThNk2CtY3og8M02CpWqQtXFgtJTqIB8zeQxnYsQ//S5ibUsgIqYg8zuPI+Jg==',key_name='tempest-TestNetworkBasicOps-371597924',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:07:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ir8n07cx',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:07:23Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=d5460be9-d4a4-45e1-8bd1-99144801279c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.275 186985 DEBUG nova.network.os_vif_util [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.276 186985 DEBUG nova.network.os_vif_util [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.276 186985 DEBUG os_vif [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:09:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:05.276 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[724343ca-b0a0-4a72-a124-b6359ffeaac9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:05.277 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15 namespace which is not needed anymore
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.278 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.278 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47b46c17-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.279 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.279 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.281 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.283 186985 INFO os_vif [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41')
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.283 186985 DEBUG nova.virt.libvirt.guest [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-1174788018</nova:name>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:09:05</nova:creationTime>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     <nova:port uuid="b4bd60c8-946f-4124-b413-02ee57a5b597">
Nov 22 10:09:05 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:09:05 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:09:05 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:09:05 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:09:05 compute-0 nova_compute[186981]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 22 10:09:05 compute-0 podman[216839]: 2025-11-22 10:09:05.357271902 +0000 UTC m=+0.059991509 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 10:09:05 compute-0 podman[216841]: 2025-11-22 10:09:05.379391456 +0000 UTC m=+0.082111683 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 10:09:05 compute-0 neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15[216375]: [NOTICE]   (216393) : haproxy version is 2.8.14-c23fe91
Nov 22 10:09:05 compute-0 neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15[216375]: [NOTICE]   (216393) : path to executable is /usr/sbin/haproxy
Nov 22 10:09:05 compute-0 neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15[216375]: [WARNING]  (216393) : Exiting Master process...
Nov 22 10:09:05 compute-0 neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15[216375]: [WARNING]  (216393) : Exiting Master process...
Nov 22 10:09:05 compute-0 neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15[216375]: [ALERT]    (216393) : Current worker (216402) exited with code 143 (Terminated)
Nov 22 10:09:05 compute-0 neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15[216375]: [WARNING]  (216393) : All workers exited. Exiting... (0)
Nov 22 10:09:05 compute-0 systemd[1]: libpod-9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135.scope: Deactivated successfully.
Nov 22 10:09:05 compute-0 podman[216903]: 2025-11-22 10:09:05.51684557 +0000 UTC m=+0.145330569 container died 9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 10:09:05 compute-0 nova_compute[186981]: 2025-11-22 10:09:05.557 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135-userdata-shm.mount: Deactivated successfully.
Nov 22 10:09:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-5708ac4736a24e3c32eac78d634c6f6840f0220d78b4b0e2357b79c87b8e5f4f-merged.mount: Deactivated successfully.
Nov 22 10:09:05 compute-0 podman[216903]: 2025-11-22 10:09:05.861084071 +0000 UTC m=+0.489569080 container cleanup 9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 10:09:05 compute-0 systemd[1]: libpod-conmon-9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135.scope: Deactivated successfully.
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.086 186985 DEBUG oslo_concurrency.lockutils [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.087 186985 DEBUG oslo_concurrency.lockutils [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.087 186985 DEBUG nova.network.neutron [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.169 186985 DEBUG nova.compute.manager [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-vif-unplugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.169 186985 DEBUG oslo_concurrency.lockutils [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.171 186985 DEBUG oslo_concurrency.lockutils [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.171 186985 DEBUG oslo_concurrency.lockutils [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.172 186985 DEBUG nova.compute.manager [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] No waiting events found dispatching network-vif-unplugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.172 186985 WARNING nova.compute.manager [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received unexpected event network-vif-unplugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 for instance with vm_state active and task_state None.
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.172 186985 DEBUG nova.compute.manager [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-vif-plugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.173 186985 DEBUG oslo_concurrency.lockutils [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.173 186985 DEBUG oslo_concurrency.lockutils [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.173 186985 DEBUG oslo_concurrency.lockutils [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.174 186985 DEBUG nova.compute.manager [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] No waiting events found dispatching network-vif-plugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.174 186985 WARNING nova.compute.manager [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received unexpected event network-vif-plugged-47b46c17-414f-45b6-b0f7-72fc46a774d5 for instance with vm_state active and task_state None.
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.174 186985 DEBUG nova.compute.manager [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-vif-deleted-47b46c17-414f-45b6-b0f7-72fc46a774d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.174 186985 INFO nova.compute.manager [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Neutron deleted interface 47b46c17-414f-45b6-b0f7-72fc46a774d5; detaching it from the instance and deleting it from the info cache
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.175 186985 DEBUG nova.network.neutron [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updating instance_info_cache with network_info: [{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.202 186985 DEBUG nova.objects.instance [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lazy-loading 'system_metadata' on Instance uuid d5460be9-d4a4-45e1-8bd1-99144801279c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.228 186985 DEBUG nova.objects.instance [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lazy-loading 'flavor' on Instance uuid d5460be9-d4a4-45e1-8bd1-99144801279c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.249 186985 DEBUG nova.virt.libvirt.vif [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174788018',display_name='tempest-TestNetworkBasicOps-server-1174788018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174788018',id=6,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+BWSuiLhQPxAECcK7DaVlWzFtnG0hn0O+hqo9OO4MlApMhNsc33zI/cmxJx6fZIyL5GfThNk2CtY3og8M02CpWqQtXFgtJTqIB8zeQxnYsQ//S5ibUsgIqYg8zuPI+Jg==',key_name='tempest-TestNetworkBasicOps-371597924',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:07:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ir8n07cx',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:07:23Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=d5460be9-d4a4-45e1-8bd1-99144801279c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:09:06 compute-0 podman[216939]: 2025-11-22 10:09:06.250368382 +0000 UTC m=+0.367944539 container remove 9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.250 186985 DEBUG nova.network.os_vif_util [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Converting VIF {"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.251 186985 DEBUG nova.network.os_vif_util [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.254 186985 DEBUG nova.virt.libvirt.guest [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cb:e5:b6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47b46c17-41"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.258 186985 DEBUG nova.virt.libvirt.guest [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:cb:e5:b6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47b46c17-41"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <name>instance-00000006</name>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <uuid>d5460be9-d4a4-45e1-8bd1-99144801279c</uuid>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-1174788018</nova:name>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:09:05</nova:creationTime>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:port uuid="b4bd60c8-946f-4124-b413-02ee57a5b597">
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:09:06 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:09:06 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:06.258 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[f856de41-f4e8-4778-bf3f-e79f69c57e9b]: (4, ('Sat Nov 22 10:09:05 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15 (9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135)\n9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135\nSat Nov 22 10:09:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15 (9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135)\n9502d42bc52963254240a65e2f75d1f7fe98ea91e093fa29320f8af6e62da135\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <memory unit='KiB'>131072</memory>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <vcpu placement='static'>1</vcpu>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <resource>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <partition>/machine</partition>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </resource>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <sysinfo type='smbios'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <system>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='manufacturer'>RDO</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='serial'>d5460be9-d4a4-45e1-8bd1-99144801279c</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='uuid'>d5460be9-d4a4-45e1-8bd1-99144801279c</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='family'>Virtual Machine</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </system>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <os>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <boot dev='hd'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <smbios mode='sysinfo'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </os>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <features>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <vmcoreinfo state='on'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </features>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <vendor>AMD</vendor>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='x2apic'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc-deadline'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='hypervisor'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc_adjust'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='spec-ctrl'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='stibp'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='ssbd'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='cmp_legacy'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='overflow-recov'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='succor'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='ibrs'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='amd-ssbd'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='virt-ssbd'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='lbrv'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='tsc-scale'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='vmcb-clean'/>
Nov 22 10:09:06 compute-0 kernel: tapc442911c-30: left promiscuous mode
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='flushbyasid'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='pause-filter'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='pfthreshold'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='xsaves'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='svm'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='topoext'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='npt'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='nrip-save'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <clock offset='utc'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <timer name='hpet' present='no'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <on_poweroff>destroy</on_poweroff>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <on_reboot>restart</on_reboot>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <on_crash>destroy</on_crash>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <disk type='file' device='disk'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk' index='2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <backingStore type='file' index='3'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:         <format type='raw'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:         <source file='/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:         <backingStore/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       </backingStore>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target dev='vda' bus='virtio'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='virtio-disk0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <disk type='file' device='cdrom'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk.config' index='1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <backingStore/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target dev='sda' bus='sata'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <readonly/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='sata0-0-0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pcie.0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:06.259 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c2f27e-5919-45e3-88fd-364bf205448d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='1' port='0x10'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='2' port='0x11'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='3' port='0x12'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.3'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='4' port='0x13'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.4'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='5' port='0x14'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.5'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='6' port='0x15'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.6'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='7' port='0x16'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.7'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:06.260 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc442911c-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='8' port='0x17'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.8'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='9' port='0x18'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.9'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='10' port='0x19'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.10'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='11' port='0x1a'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.11'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='12' port='0x1b'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.12'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='13' port='0x1c'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.13'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='14' port='0x1d'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.14'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='15' port='0x1e'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.15'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='16' port='0x1f'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.16'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='17' port='0x20'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.17'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='18' port='0x21'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.18'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='19' port='0x22'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.19'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='20' port='0x23'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.20'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='21' port='0x24'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.21'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='22' port='0x25'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.22'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='23' port='0x26'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.23'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='24' port='0x27'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.24'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='25' port='0x28'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.25'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-pci-bridge'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.26'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='usb'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='sata' index='0'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='ide'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <interface type='ethernet'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <mac address='fa:16:3e:d0:6e:90'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target dev='tapb4bd60c8-94'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model type='virtio'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <mtu size='1442'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='net0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <serial type='pty'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/console.log' append='off'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target type='isa-serial' port='0'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:         <model name='isa-serial'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       </target>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/console.log' append='off'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target type='serial' port='0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </console>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <input type='tablet' bus='usb'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='input0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='usb' bus='0' port='1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <input type='mouse' bus='ps2'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='input1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <input type='keyboard' bus='ps2'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='input2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <listen type='address' address='::0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </graphics>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <audio id='1' type='none'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <video>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='video0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </video>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <watchdog model='itco' action='reset'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='watchdog0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </watchdog>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <memballoon model='virtio'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <stats period='10'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='balloon0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <rng model='virtio'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <backend model='random'>/dev/urandom</backend>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='rng0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <label>system_u:system_r:svirt_t:s0:c260,c820</label>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c260,c820</imagelabel>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <label>+107:+107</label>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <imagelabel>+107:+107</imagelabel>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:09:06 compute-0 nova_compute[186981]: </domain>
Nov 22 10:09:06 compute-0 nova_compute[186981]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.258 186985 DEBUG nova.virt.libvirt.guest [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cb:e5:b6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47b46c17-41"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.262 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.264 186985 DEBUG nova.virt.libvirt.guest [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:cb:e5:b6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47b46c17-41"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <name>instance-00000006</name>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <uuid>d5460be9-d4a4-45e1-8bd1-99144801279c</uuid>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-1174788018</nova:name>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:09:05</nova:creationTime>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:port uuid="b4bd60c8-946f-4124-b413-02ee57a5b597">
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:09:06 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <memory unit='KiB'>131072</memory>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <vcpu placement='static'>1</vcpu>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <resource>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <partition>/machine</partition>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </resource>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <sysinfo type='smbios'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <system>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='manufacturer'>RDO</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='serial'>d5460be9-d4a4-45e1-8bd1-99144801279c</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='uuid'>d5460be9-d4a4-45e1-8bd1-99144801279c</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <entry name='family'>Virtual Machine</entry>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </system>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <os>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <boot dev='hd'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <smbios mode='sysinfo'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </os>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <features>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <vmcoreinfo state='on'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </features>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <vendor>AMD</vendor>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='x2apic'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc-deadline'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='hypervisor'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='tsc_adjust'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='spec-ctrl'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='stibp'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='ssbd'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='cmp_legacy'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='overflow-recov'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='succor'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='ibrs'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='amd-ssbd'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='virt-ssbd'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='lbrv'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='tsc-scale'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='vmcb-clean'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='flushbyasid'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='pause-filter'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='pfthreshold'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='xsaves'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='svm'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='require' name='topoext'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='npt'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <feature policy='disable' name='nrip-save'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <clock offset='utc'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <timer name='hpet' present='no'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <on_poweroff>destroy</on_poweroff>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <on_reboot>restart</on_reboot>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <on_crash>destroy</on_crash>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <disk type='file' device='disk'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk' index='2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <backingStore type='file' index='3'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:         <format type='raw'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:         <source file='/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:         <backingStore/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       </backingStore>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target dev='vda' bus='virtio'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='virtio-disk0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <disk type='file' device='cdrom'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <source file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/disk.config' index='1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <backingStore/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target dev='sda' bus='sata'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <readonly/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='sata0-0-0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pcie.0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='1' port='0x10'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='2' port='0x11'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='3' port='0x12'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.3'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='4' port='0x13'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.4'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='5' port='0x14'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.5'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='6' port='0x15'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.6'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='7' port='0x16'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.7'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='8' port='0x17'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.8'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='9' port='0x18'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.9'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='10' port='0x19'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.10'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='11' port='0x1a'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.11'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='12' port='0x1b'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.12'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='13' port='0x1c'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.13'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='14' port='0x1d'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.14'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='15' port='0x1e'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.15'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='16' port='0x1f'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.16'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='17' port='0x20'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.17'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='18' port='0x21'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.18'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='19' port='0x22'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.19'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='20' port='0x23'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.20'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='21' port='0x24'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.21'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='22' port='0x25'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.22'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='23' port='0x26'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.23'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='24' port='0x27'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.24'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-root-port'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target chassis='25' port='0x28'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.25'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model name='pcie-pci-bridge'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='pci.26'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='usb'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <controller type='sata' index='0'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='ide'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </controller>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <interface type='ethernet'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <mac address='fa:16:3e:d0:6e:90'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target dev='tapb4bd60c8-94'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model type='virtio'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <mtu size='1442'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='net0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <serial type='pty'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/console.log' append='off'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target type='isa-serial' port='0'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:         <model name='isa-serial'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       </target>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <source path='/dev/pts/0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <log file='/var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c/console.log' append='off'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <target type='serial' port='0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='serial0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </console>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <input type='tablet' bus='usb'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='input0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='usb' bus='0' port='1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <input type='mouse' bus='ps2'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='input1'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <input type='keyboard' bus='ps2'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='input2'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </input>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <listen type='address' address='::0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </graphics>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <audio id='1' type='none'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <video>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='video0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </video>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <watchdog model='itco' action='reset'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='watchdog0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </watchdog>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <memballoon model='virtio'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <stats period='10'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='balloon0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <rng model='virtio'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <backend model='random'>/dev/urandom</backend>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <alias name='rng0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <label>system_u:system_r:svirt_t:s0:c260,c820</label>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c260,c820</imagelabel>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <label>+107:+107</label>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <imagelabel>+107:+107</imagelabel>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </seclabel>
Nov 22 10:09:06 compute-0 nova_compute[186981]: </domain>
Nov 22 10:09:06 compute-0 nova_compute[186981]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.264 186985 WARNING nova.virt.libvirt.driver [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Detaching interface fa:16:3e:cb:e5:b6 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap47b46c17-41' not found.
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.265 186985 DEBUG nova.virt.libvirt.vif [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174788018',display_name='tempest-TestNetworkBasicOps-server-1174788018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174788018',id=6,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+BWSuiLhQPxAECcK7DaVlWzFtnG0hn0O+hqo9OO4MlApMhNsc33zI/cmxJx6fZIyL5GfThNk2CtY3og8M02CpWqQtXFgtJTqIB8zeQxnYsQ//S5ibUsgIqYg8zuPI+Jg==',key_name='tempest-TestNetworkBasicOps-371597924',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:07:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ir8n07cx',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:07:23Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=d5460be9-d4a4-45e1-8bd1-99144801279c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.266 186985 DEBUG nova.network.os_vif_util [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Converting VIF {"id": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "address": "fa:16:3e:cb:e5:b6", "network": {"id": "c442911c-33e7-4086-a8a7-29e86a0c5c15", "bridge": "br-int", "label": "tempest-network-smoke--1967909756", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b46c17-41", "ovs_interfaceid": "47b46c17-414f-45b6-b0f7-72fc46a774d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.267 186985 DEBUG nova.network.os_vif_util [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.267 186985 DEBUG os_vif [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.271 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.271 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47b46c17-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.271 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.278 186985 INFO os_vif [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:e5:b6,bridge_name='br-int',has_traffic_filtering=True,id=47b46c17-414f-45b6-b0f7-72fc46a774d5,network=Network(c442911c-33e7-4086-a8a7-29e86a0c5c15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b46c17-41')
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.279 186985 DEBUG nova.virt.libvirt.guest [req-0b01f556-b557-423d-b766-031028a7cd01 req-226c9b14-1116-4141-9c6b-0b0c8547b54a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:name>tempest-TestNetworkBasicOps-server-1174788018</nova:name>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:creationTime>2025-11-22 10:09:06</nova:creationTime>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:flavor name="m1.nano">
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:memory>128</nova:memory>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:disk>1</nova:disk>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:swap>0</nova:swap>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:vcpus>1</nova:vcpus>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </nova:flavor>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:owner>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </nova:owner>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   <nova:ports>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     <nova:port uuid="b4bd60c8-946f-4124-b413-02ee57a5b597">
Nov 22 10:09:06 compute-0 nova_compute[186981]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 10:09:06 compute-0 nova_compute[186981]:     </nova:port>
Nov 22 10:09:06 compute-0 nova_compute[186981]:   </nova:ports>
Nov 22 10:09:06 compute-0 nova_compute[186981]: </nova:instance>
Nov 22 10:09:06 compute-0 nova_compute[186981]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.291 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:06 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:06.293 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[0a24a17b-7e6e-425d-87b0-2e1d1d7c5136]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:06 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:06.312 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[85dfc701-f757-47f8-8beb-bb8a52e2806b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:06 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:06.313 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[653e830c-db74-442b-bc3a-99cb10bc6363]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:06 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:06.334 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1a7937-2e27-4c97-9d2a-28b4a5199bc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353036, 'reachable_time': 16155, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216955, 'error': None, 'target': 'ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:06 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:06.336 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c442911c-33e7-4086-a8a7-29e86a0c5c15 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:09:06 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:06.337 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[c6452dbb-bb50-454c-a3ba-8def84812406]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:06 compute-0 systemd[1]: run-netns-ovnmeta\x2dc442911c\x2d33e7\x2d4086\x2da8a7\x2d29e86a0c5c15.mount: Deactivated successfully.
Nov 22 10:09:06 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:06.720 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:09:06 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:06.720 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:09:06 compute-0 nova_compute[186981]: 2025-11-22 10:09:06.750 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:07 compute-0 ovn_controller[95329]: 2025-11-22T10:09:07Z|00112|binding|INFO|Releasing lport 7e6fffde-8524-45a3-90aa-146144523c34 from this chassis (sb_readonly=0)
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.140 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.534 186985 INFO nova.network.neutron [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Port 47b46c17-414f-45b6-b0f7-72fc46a774d5 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.534 186985 DEBUG nova.network.neutron [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updating instance_info_cache with network_info: [{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.558 186985 DEBUG oslo_concurrency.lockutils [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.581 186985 DEBUG oslo_concurrency.lockutils [None req-480f847d-ef3e-42f5-a63b-87c46a1445ce fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "interface-d5460be9-d4a4-45e1-8bd1-99144801279c-47b46c17-414f-45b6-b0f7-72fc46a774d5" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.506s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.731 186985 DEBUG nova.compute.manager [req-38f26ec9-60be-49fa-885b-d279a4b05078 req-17e54f90-4389-4d24-adde-344b28cf87eb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-changed-b4bd60c8-946f-4124-b413-02ee57a5b597 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.732 186985 DEBUG nova.compute.manager [req-38f26ec9-60be-49fa-885b-d279a4b05078 req-17e54f90-4389-4d24-adde-344b28cf87eb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Refreshing instance network info cache due to event network-changed-b4bd60c8-946f-4124-b413-02ee57a5b597. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.732 186985 DEBUG oslo_concurrency.lockutils [req-38f26ec9-60be-49fa-885b-d279a4b05078 req-17e54f90-4389-4d24-adde-344b28cf87eb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.733 186985 DEBUG oslo_concurrency.lockutils [req-38f26ec9-60be-49fa-885b-d279a4b05078 req-17e54f90-4389-4d24-adde-344b28cf87eb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.733 186985 DEBUG nova.network.neutron [req-38f26ec9-60be-49fa-885b-d279a4b05078 req-17e54f90-4389-4d24-adde-344b28cf87eb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Refreshing network info cache for port b4bd60c8-946f-4124-b413-02ee57a5b597 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.771 186985 DEBUG oslo_concurrency.lockutils [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.771 186985 DEBUG oslo_concurrency.lockutils [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.772 186985 DEBUG oslo_concurrency.lockutils [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.772 186985 DEBUG oslo_concurrency.lockutils [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.772 186985 DEBUG oslo_concurrency.lockutils [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.774 186985 INFO nova.compute.manager [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Terminating instance
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.775 186985 DEBUG nova.compute.manager [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:09:07 compute-0 kernel: tapb4bd60c8-94 (unregistering): left promiscuous mode
Nov 22 10:09:07 compute-0 NetworkManager[55425]: <info>  [1763806147.8513] device (tapb4bd60c8-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.854 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:07 compute-0 ovn_controller[95329]: 2025-11-22T10:09:07Z|00113|binding|INFO|Releasing lport b4bd60c8-946f-4124-b413-02ee57a5b597 from this chassis (sb_readonly=0)
Nov 22 10:09:07 compute-0 ovn_controller[95329]: 2025-11-22T10:09:07Z|00114|binding|INFO|Setting lport b4bd60c8-946f-4124-b413-02ee57a5b597 down in Southbound
Nov 22 10:09:07 compute-0 ovn_controller[95329]: 2025-11-22T10:09:07Z|00115|binding|INFO|Removing iface tapb4bd60c8-94 ovn-installed in OVS
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.857 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:07.865 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:6e:90 10.100.0.7'], port_security=['fa:16:3e:d0:6e:90 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd5460be9-d4a4-45e1-8bd1-99144801279c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b46282d-b3ed-40b7-90ce-65aaeac61049', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f5906cda-5c1a-4e21-9e63-b78db27a3837', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6aae3095-cd5a-4c64-be68-4ceb75b321b5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=b4bd60c8-946f-4124-b413-02ee57a5b597) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:09:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:07.868 104216 INFO neutron.agent.ovn.metadata.agent [-] Port b4bd60c8-946f-4124-b413-02ee57a5b597 in datapath 3b46282d-b3ed-40b7-90ce-65aaeac61049 unbound from our chassis
Nov 22 10:09:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:07.869 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3b46282d-b3ed-40b7-90ce-65aaeac61049, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:09:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:07.870 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[3da76f05-4d72-4699-8caf-86a7a7a171bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:07 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:07.871 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049 namespace which is not needed anymore
Nov 22 10:09:07 compute-0 nova_compute[186981]: 2025-11-22 10:09:07.878 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:07 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 22 10:09:07 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 17.361s CPU time.
Nov 22 10:09:07 compute-0 systemd-machined[153303]: Machine qemu-6-instance-00000006 terminated.
Nov 22 10:09:07 compute-0 NetworkManager[55425]: <info>  [1763806147.9958] manager: (tapb4bd60c8-94): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Nov 22 10:09:08 compute-0 neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049[216115]: [NOTICE]   (216131) : haproxy version is 2.8.14-c23fe91
Nov 22 10:09:08 compute-0 neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049[216115]: [NOTICE]   (216131) : path to executable is /usr/sbin/haproxy
Nov 22 10:09:08 compute-0 neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049[216115]: [WARNING]  (216131) : Exiting Master process...
Nov 22 10:09:08 compute-0 neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049[216115]: [ALERT]    (216131) : Current worker (216142) exited with code 143 (Terminated)
Nov 22 10:09:08 compute-0 neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049[216115]: [WARNING]  (216131) : All workers exited. Exiting... (0)
Nov 22 10:09:08 compute-0 systemd[1]: libpod-be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47.scope: Deactivated successfully.
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.034 186985 INFO nova.virt.libvirt.driver [-] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Instance destroyed successfully.
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.035 186985 DEBUG nova.objects.instance [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid d5460be9-d4a4-45e1-8bd1-99144801279c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:09:08 compute-0 podman[216976]: 2025-11-22 10:09:08.039725218 +0000 UTC m=+0.081000033 container died be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.051 186985 DEBUG nova.virt.libvirt.vif [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:07:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174788018',display_name='tempest-TestNetworkBasicOps-server-1174788018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174788018',id=6,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+BWSuiLhQPxAECcK7DaVlWzFtnG0hn0O+hqo9OO4MlApMhNsc33zI/cmxJx6fZIyL5GfThNk2CtY3og8M02CpWqQtXFgtJTqIB8zeQxnYsQ//S5ibUsgIqYg8zuPI+Jg==',key_name='tempest-TestNetworkBasicOps-371597924',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:07:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-ir8n07cx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:07:23Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=d5460be9-d4a4-45e1-8bd1-99144801279c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.052 186985 DEBUG nova.network.os_vif_util [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.052 186985 DEBUG nova.network.os_vif_util [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d0:6e:90,bridge_name='br-int',has_traffic_filtering=True,id=b4bd60c8-946f-4124-b413-02ee57a5b597,network=Network(3b46282d-b3ed-40b7-90ce-65aaeac61049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4bd60c8-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.053 186985 DEBUG os_vif [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:6e:90,bridge_name='br-int',has_traffic_filtering=True,id=b4bd60c8-946f-4124-b413-02ee57a5b597,network=Network(3b46282d-b3ed-40b7-90ce-65aaeac61049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4bd60c8-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.054 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.054 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb4bd60c8-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.057 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.060 186985 INFO os_vif [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:6e:90,bridge_name='br-int',has_traffic_filtering=True,id=b4bd60c8-946f-4124-b413-02ee57a5b597,network=Network(3b46282d-b3ed-40b7-90ce-65aaeac61049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4bd60c8-94')
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.061 186985 INFO nova.virt.libvirt.driver [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Deleting instance files /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c_del
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.062 186985 INFO nova.virt.libvirt.driver [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Deletion of /var/lib/nova/instances/d5460be9-d4a4-45e1-8bd1-99144801279c_del complete
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.111 186985 INFO nova.compute.manager [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Took 0.34 seconds to destroy the instance on the hypervisor.
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.112 186985 DEBUG oslo.service.loopingcall [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.112 186985 DEBUG nova.compute.manager [-] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.113 186985 DEBUG nova.network.neutron [-] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:09:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47-userdata-shm.mount: Deactivated successfully.
Nov 22 10:09:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-e90a496b9b3e18a8aeb44b5f0de1dddba49e03b6b283abcd947acbfb55608d3b-merged.mount: Deactivated successfully.
Nov 22 10:09:08 compute-0 podman[216976]: 2025-11-22 10:09:08.87395806 +0000 UTC m=+0.915232875 container cleanup be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:09:08 compute-0 systemd[1]: libpod-conmon-be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47.scope: Deactivated successfully.
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.912 186985 DEBUG nova.network.neutron [-] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.933 186985 INFO nova.compute.manager [-] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Took 0.82 seconds to deallocate network for instance.
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.983 186985 DEBUG oslo_concurrency.lockutils [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:08 compute-0 nova_compute[186981]: 2025-11-22 10:09:08.983 186985 DEBUG oslo_concurrency.lockutils [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.077 186985 DEBUG nova.compute.provider_tree [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.092 186985 DEBUG nova.scheduler.client.report [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.115 186985 DEBUG oslo_concurrency.lockutils [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:09 compute-0 podman[217022]: 2025-11-22 10:09:09.129992091 +0000 UTC m=+0.225922050 container remove be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.137 186985 INFO nova.scheduler.client.report [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance d5460be9-d4a4-45e1-8bd1-99144801279c
Nov 22 10:09:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:09.137 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[d3bad1a0-f820-4bc7-a4d6-b3fbc6c34ec0]: (4, ('Sat Nov 22 10:09:07 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049 (be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47)\nbe4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47\nSat Nov 22 10:09:08 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049 (be4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47)\nbe4ae72d3e561fa6da0bc888b63e7a239a23d1928c31413aa4626c7a19849b47\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:09.139 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[23a064d4-e936-470a-9a3b-0d3dcbd5fe29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:09.140 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b46282d-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.143 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:09 compute-0 kernel: tap3b46282d-b0: left promiscuous mode
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.154 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:09.157 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[78f845bf-5f20-4acd-aaaf-d2548c3e9910]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:09.171 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[9dbf64a6-05da-4a01-9237-7555004333fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:09.173 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[19bab392-ce1e-4ec9-bed7-5c517bf58888]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:09.188 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[e9e7c768-07f9-4359-a136-c1b5fefbed31]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349970, 'reachable_time': 36908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217036, 'error': None, 'target': 'ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d3b46282d\x2db3ed\x2d40b7\x2d90ce\x2d65aaeac61049.mount: Deactivated successfully.
Nov 22 10:09:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:09.190 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3b46282d-b3ed-40b7-90ce-65aaeac61049 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:09:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:09.191 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[1d8b982b-3cbb-4e2d-81ac-8f1b2ac27fa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.207 186985 DEBUG oslo_concurrency.lockutils [None req-d9de54b5-548f-4036-954f-36300836881b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.436s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.241 186985 DEBUG nova.network.neutron [req-38f26ec9-60be-49fa-885b-d279a4b05078 req-17e54f90-4389-4d24-adde-344b28cf87eb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updated VIF entry in instance network info cache for port b4bd60c8-946f-4124-b413-02ee57a5b597. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.241 186985 DEBUG nova.network.neutron [req-38f26ec9-60be-49fa-885b-d279a4b05078 req-17e54f90-4389-4d24-adde-344b28cf87eb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Updating instance_info_cache with network_info: [{"id": "b4bd60c8-946f-4124-b413-02ee57a5b597", "address": "fa:16:3e:d0:6e:90", "network": {"id": "3b46282d-b3ed-40b7-90ce-65aaeac61049", "bridge": "br-int", "label": "tempest-network-smoke--1408989801", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4bd60c8-94", "ovs_interfaceid": "b4bd60c8-946f-4124-b413-02ee57a5b597", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.257 186985 DEBUG oslo_concurrency.lockutils [req-38f26ec9-60be-49fa-885b-d279a4b05078 req-17e54f90-4389-4d24-adde-344b28cf87eb 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-d5460be9-d4a4-45e1-8bd1-99144801279c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.826 186985 DEBUG nova.compute.manager [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-vif-unplugged-b4bd60c8-946f-4124-b413-02ee57a5b597 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.827 186985 DEBUG oslo_concurrency.lockutils [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.827 186985 DEBUG oslo_concurrency.lockutils [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.827 186985 DEBUG oslo_concurrency.lockutils [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.828 186985 DEBUG nova.compute.manager [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] No waiting events found dispatching network-vif-unplugged-b4bd60c8-946f-4124-b413-02ee57a5b597 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.828 186985 WARNING nova.compute.manager [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received unexpected event network-vif-unplugged-b4bd60c8-946f-4124-b413-02ee57a5b597 for instance with vm_state deleted and task_state None.
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.829 186985 DEBUG nova.compute.manager [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-vif-plugged-b4bd60c8-946f-4124-b413-02ee57a5b597 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.829 186985 DEBUG oslo_concurrency.lockutils [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.829 186985 DEBUG oslo_concurrency.lockutils [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.830 186985 DEBUG oslo_concurrency.lockutils [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "d5460be9-d4a4-45e1-8bd1-99144801279c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.830 186985 DEBUG nova.compute.manager [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] No waiting events found dispatching network-vif-plugged-b4bd60c8-946f-4124-b413-02ee57a5b597 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.830 186985 WARNING nova.compute.manager [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received unexpected event network-vif-plugged-b4bd60c8-946f-4124-b413-02ee57a5b597 for instance with vm_state deleted and task_state None.
Nov 22 10:09:09 compute-0 nova_compute[186981]: 2025-11-22 10:09:09.831 186985 DEBUG nova.compute.manager [req-0c320db6-96b3-4418-859c-070a6a1445e6 req-d72f2af8-d70a-4c9e-a180-a019c20def79 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Received event network-vif-deleted-b4bd60c8-946f-4124-b413-02ee57a5b597 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:10 compute-0 nova_compute[186981]: 2025-11-22 10:09:10.558 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:10 compute-0 podman[217042]: 2025-11-22 10:09:10.620353371 +0000 UTC m=+0.068222303 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, managed_by=edpm_ansible)
Nov 22 10:09:10 compute-0 podman[217041]: 2025-11-22 10:09:10.637545571 +0000 UTC m=+0.089608528 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 10:09:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:10.722 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:13 compute-0 nova_compute[186981]: 2025-11-22 10:09:13.090 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:13 compute-0 nova_compute[186981]: 2025-11-22 10:09:13.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:09:13 compute-0 nova_compute[186981]: 2025-11-22 10:09:13.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:09:13 compute-0 nova_compute[186981]: 2025-11-22 10:09:13.662 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:09:14 compute-0 podman[217080]: 2025-11-22 10:09:14.616184803 +0000 UTC m=+0.061447709 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 10:09:14 compute-0 podman[217081]: 2025-11-22 10:09:14.65489721 +0000 UTC m=+0.093736251 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.560 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.613 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.613 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.613 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.613 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.814 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.816 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5770MB free_disk=73.45889282226562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.816 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.816 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.883 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.884 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.909 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.927 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.957 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:09:15 compute-0 nova_compute[186981]: 2025-11-22 10:09:15.957 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:16 compute-0 nova_compute[186981]: 2025-11-22 10:09:16.059 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763806141.0584235, b0411876-4519-4bcb-a325-000d02d8b59d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:09:16 compute-0 nova_compute[186981]: 2025-11-22 10:09:16.060 186985 INFO nova.compute.manager [-] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] VM Stopped (Lifecycle Event)
Nov 22 10:09:16 compute-0 nova_compute[186981]: 2025-11-22 10:09:16.076 186985 DEBUG nova.compute.manager [None req-8668eba0-d680-4271-a381-71b0eba2bf9b - - - - - -] [instance: b0411876-4519-4bcb-a325-000d02d8b59d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:09:16 compute-0 nova_compute[186981]: 2025-11-22 10:09:16.311 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:16 compute-0 nova_compute[186981]: 2025-11-22 10:09:16.452 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:17.937 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:17.937 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:17.938 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:17 compute-0 nova_compute[186981]: 2025-11-22 10:09:17.957 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:09:17 compute-0 nova_compute[186981]: 2025-11-22 10:09:17.958 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:09:17 compute-0 nova_compute[186981]: 2025-11-22 10:09:17.958 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:09:18 compute-0 nova_compute[186981]: 2025-11-22 10:09:18.092 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:19 compute-0 nova_compute[186981]: 2025-11-22 10:09:19.590 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:09:19 compute-0 nova_compute[186981]: 2025-11-22 10:09:19.590 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:09:20 compute-0 nova_compute[186981]: 2025-11-22 10:09:20.562 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:22 compute-0 nova_compute[186981]: 2025-11-22 10:09:22.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:09:23 compute-0 nova_compute[186981]: 2025-11-22 10:09:23.034 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763806148.0316966, d5460be9-d4a4-45e1-8bd1-99144801279c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:09:23 compute-0 nova_compute[186981]: 2025-11-22 10:09:23.034 186985 INFO nova.compute.manager [-] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] VM Stopped (Lifecycle Event)
Nov 22 10:09:23 compute-0 nova_compute[186981]: 2025-11-22 10:09:23.066 186985 DEBUG nova.compute.manager [None req-06edd025-bd34-4946-a8d0-dab1b9926d6a - - - - - -] [instance: d5460be9-d4a4-45e1-8bd1-99144801279c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:09:23 compute-0 nova_compute[186981]: 2025-11-22 10:09:23.094 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:25 compute-0 nova_compute[186981]: 2025-11-22 10:09:25.563 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:25 compute-0 podman[217127]: 2025-11-22 10:09:25.60261327 +0000 UTC m=+0.053208584 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 10:09:28 compute-0 nova_compute[186981]: 2025-11-22 10:09:28.098 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:30 compute-0 nova_compute[186981]: 2025-11-22 10:09:30.566 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.398 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.398 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.411 186985 DEBUG nova.compute.manager [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.491 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.492 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.500 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.500 186985 INFO nova.compute.claims [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.611 186985 DEBUG nova.compute.provider_tree [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.628 186985 DEBUG nova.scheduler.client.report [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.653 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.654 186985 DEBUG nova.compute.manager [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.706 186985 DEBUG nova.compute.manager [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.706 186985 DEBUG nova.network.neutron [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.753 186985 INFO nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.806 186985 DEBUG nova.compute.manager [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.920 186985 DEBUG nova.compute.manager [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.921 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.922 186985 INFO nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Creating image(s)
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.922 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.922 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.923 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:31 compute-0 nova_compute[186981]: 2025-11-22 10:09:31.934 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.006 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.007 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.009 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.032 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.098 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.099 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.135 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.136 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.136 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.227 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.229 186985 DEBUG nova.virt.disk.api [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.229 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.285 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.286 186985 DEBUG nova.virt.disk.api [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.287 186985 DEBUG nova.objects.instance [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 1ed4e0cb-a7d4-4735-b408-704c1e6af103 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.315 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.316 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Ensure instance console log exists: /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.317 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.317 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.318 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:32 compute-0 nova_compute[186981]: 2025-11-22 10:09:32.892 186985 DEBUG nova.policy [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:09:33 compute-0 nova_compute[186981]: 2025-11-22 10:09:33.101 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:35 compute-0 nova_compute[186981]: 2025-11-22 10:09:35.572 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:35 compute-0 podman[217167]: 2025-11-22 10:09:35.651510774 +0000 UTC m=+0.105987025 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 10:09:35 compute-0 podman[217168]: 2025-11-22 10:09:35.6689173 +0000 UTC m=+0.116083312 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 10:09:35 compute-0 nova_compute[186981]: 2025-11-22 10:09:35.829 186985 DEBUG nova.network.neutron [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Successfully updated port: 0c051980-3a8d-48bb-9bf2-70309e50f76f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:09:35 compute-0 nova_compute[186981]: 2025-11-22 10:09:35.861 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-1ed4e0cb-a7d4-4735-b408-704c1e6af103" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:09:35 compute-0 nova_compute[186981]: 2025-11-22 10:09:35.861 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-1ed4e0cb-a7d4-4735-b408-704c1e6af103" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:09:35 compute-0 nova_compute[186981]: 2025-11-22 10:09:35.861 186985 DEBUG nova.network.neutron [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:09:35 compute-0 nova_compute[186981]: 2025-11-22 10:09:35.940 186985 DEBUG nova.compute.manager [req-be4c16ec-c71e-4c00-8520-3df37042e2df req-247e46ce-b85e-4ada-a6d9-6efb56678d71 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Received event network-changed-0c051980-3a8d-48bb-9bf2-70309e50f76f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:35 compute-0 nova_compute[186981]: 2025-11-22 10:09:35.940 186985 DEBUG nova.compute.manager [req-be4c16ec-c71e-4c00-8520-3df37042e2df req-247e46ce-b85e-4ada-a6d9-6efb56678d71 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Refreshing instance network info cache due to event network-changed-0c051980-3a8d-48bb-9bf2-70309e50f76f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:09:35 compute-0 nova_compute[186981]: 2025-11-22 10:09:35.941 186985 DEBUG oslo_concurrency.lockutils [req-be4c16ec-c71e-4c00-8520-3df37042e2df req-247e46ce-b85e-4ada-a6d9-6efb56678d71 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-1ed4e0cb-a7d4-4735-b408-704c1e6af103" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:09:36 compute-0 nova_compute[186981]: 2025-11-22 10:09:36.059 186985 DEBUG nova.network.neutron [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.735 186985 DEBUG nova.network.neutron [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Updating instance_info_cache with network_info: [{"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.813 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-1ed4e0cb-a7d4-4735-b408-704c1e6af103" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.814 186985 DEBUG nova.compute.manager [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Instance network_info: |[{"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.814 186985 DEBUG oslo_concurrency.lockutils [req-be4c16ec-c71e-4c00-8520-3df37042e2df req-247e46ce-b85e-4ada-a6d9-6efb56678d71 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-1ed4e0cb-a7d4-4735-b408-704c1e6af103" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.814 186985 DEBUG nova.network.neutron [req-be4c16ec-c71e-4c00-8520-3df37042e2df req-247e46ce-b85e-4ada-a6d9-6efb56678d71 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Refreshing network info cache for port 0c051980-3a8d-48bb-9bf2-70309e50f76f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.817 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Start _get_guest_xml network_info=[{"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.822 186985 WARNING nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.827 186985 DEBUG nova.virt.libvirt.host [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.827 186985 DEBUG nova.virt.libvirt.host [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.831 186985 DEBUG nova.virt.libvirt.host [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.832 186985 DEBUG nova.virt.libvirt.host [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.832 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.833 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.834 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.834 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.835 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.835 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.835 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.836 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.836 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.837 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.837 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.837 186985 DEBUG nova.virt.hardware [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.842 186985 DEBUG nova.virt.libvirt.vif [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:09:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889314149',display_name='tempest-TestNetworkBasicOps-server-889314149',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889314149',id=8,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOuO8gZJmd4GIU5/Mlkmp2LS/WXbBfklBSHTRD8Pn7ukzFfSpZn6afUz0/rL1MpDERyTWSLjctvjF5b7pTi2j6kp0RA+zwGrfzot1xqw/ah8C+rZia7K3tk/DlcdIsS8/g==',key_name='tempest-TestNetworkBasicOps-1262090559',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-y75xifml',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:09:31Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=1ed4e0cb-a7d4-4735-b408-704c1e6af103,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.843 186985 DEBUG nova.network.os_vif_util [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.844 186985 DEBUG nova.network.os_vif_util [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.845 186985 DEBUG nova.objects.instance [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1ed4e0cb-a7d4-4735-b408-704c1e6af103 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.874 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:09:37 compute-0 nova_compute[186981]:   <uuid>1ed4e0cb-a7d4-4735-b408-704c1e6af103</uuid>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   <name>instance-00000008</name>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-889314149</nova:name>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:09:37</nova:creationTime>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:09:37 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:09:37 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:09:37 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:09:37 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:09:37 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:09:37 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:09:37 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:09:37 compute-0 nova_compute[186981]:         <nova:port uuid="0c051980-3a8d-48bb-9bf2-70309e50f76f">
Nov 22 10:09:37 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <system>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <entry name="serial">1ed4e0cb-a7d4-4735-b408-704c1e6af103</entry>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <entry name="uuid">1ed4e0cb-a7d4-4735-b408-704c1e6af103</entry>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     </system>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   <os>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   </os>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   <features>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   </features>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk.config"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:34:59:3b"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <target dev="tap0c051980-3a"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/console.log" append="off"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <video>
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     </video>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:09:37 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:09:37 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:09:37 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:09:37 compute-0 nova_compute[186981]: </domain>
Nov 22 10:09:37 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.875 186985 DEBUG nova.compute.manager [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Preparing to wait for external event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.876 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.876 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.876 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.877 186985 DEBUG nova.virt.libvirt.vif [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:09:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889314149',display_name='tempest-TestNetworkBasicOps-server-889314149',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889314149',id=8,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOuO8gZJmd4GIU5/Mlkmp2LS/WXbBfklBSHTRD8Pn7ukzFfSpZn6afUz0/rL1MpDERyTWSLjctvjF5b7pTi2j6kp0RA+zwGrfzot1xqw/ah8C+rZia7K3tk/DlcdIsS8/g==',key_name='tempest-TestNetworkBasicOps-1262090559',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-y75xifml',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:09:31Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=1ed4e0cb-a7d4-4735-b408-704c1e6af103,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.877 186985 DEBUG nova.network.os_vif_util [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.878 186985 DEBUG nova.network.os_vif_util [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.878 186985 DEBUG os_vif [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.879 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.879 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.880 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:09:37 compute-0 NetworkManager[55425]: <info>  [1763806177.8904] manager: (tap0c051980-3a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.885 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.886 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c051980-3a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.887 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0c051980-3a, col_values=(('external_ids', {'iface-id': '0c051980-3a8d-48bb-9bf2-70309e50f76f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:59:3b', 'vm-uuid': '1ed4e0cb-a7d4-4735-b408-704c1e6af103'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.890 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.894 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.897 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:37 compute-0 nova_compute[186981]: 2025-11-22 10:09:37.899 186985 INFO os_vif [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a')
Nov 22 10:09:38 compute-0 nova_compute[186981]: 2025-11-22 10:09:38.000 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:09:38 compute-0 nova_compute[186981]: 2025-11-22 10:09:38.000 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:09:38 compute-0 nova_compute[186981]: 2025-11-22 10:09:38.001 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:34:59:3b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:09:38 compute-0 nova_compute[186981]: 2025-11-22 10:09:38.001 186985 INFO nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Using config drive
Nov 22 10:09:38 compute-0 nova_compute[186981]: 2025-11-22 10:09:38.680 186985 INFO nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Creating config drive at /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk.config
Nov 22 10:09:38 compute-0 nova_compute[186981]: 2025-11-22 10:09:38.684 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjkn6g7np execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:09:38 compute-0 nova_compute[186981]: 2025-11-22 10:09:38.823 186985 DEBUG oslo_concurrency.processutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjkn6g7np" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:09:38 compute-0 NetworkManager[55425]: <info>  [1763806178.9062] manager: (tap0c051980-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Nov 22 10:09:38 compute-0 kernel: tap0c051980-3a: entered promiscuous mode
Nov 22 10:09:38 compute-0 ovn_controller[95329]: 2025-11-22T10:09:38Z|00116|binding|INFO|Claiming lport 0c051980-3a8d-48bb-9bf2-70309e50f76f for this chassis.
Nov 22 10:09:38 compute-0 ovn_controller[95329]: 2025-11-22T10:09:38Z|00117|binding|INFO|0c051980-3a8d-48bb-9bf2-70309e50f76f: Claiming fa:16:3e:34:59:3b 10.100.0.9
Nov 22 10:09:38 compute-0 nova_compute[186981]: 2025-11-22 10:09:38.908 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:38 compute-0 nova_compute[186981]: 2025-11-22 10:09:38.916 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:38 compute-0 nova_compute[186981]: 2025-11-22 10:09:38.921 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:38 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:38.932 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:59:3b 10.100.0.9'], port_security=['fa:16:3e:34:59:3b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-388161960', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1ed4e0cb-a7d4-4735-b408-704c1e6af103', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-388161960', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27c5a67c-dc4c-4d67-b4f1-e6a36c0e1eec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31a25dd6-387e-429c-9754-fa9b4c2f743d, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=0c051980-3a8d-48bb-9bf2-70309e50f76f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:09:38 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:38.934 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 0c051980-3a8d-48bb-9bf2-70309e50f76f in datapath 1d7ca0ab-f499-4866-82d7-e753ea2e04cb bound to our chassis
Nov 22 10:09:38 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:38.936 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1d7ca0ab-f499-4866-82d7-e753ea2e04cb
Nov 22 10:09:38 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:38.948 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[2d347ee7-949a-446c-8778-57b6310818d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:38 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:38.949 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1d7ca0ab-f1 in ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:09:38 compute-0 systemd-udevd[217234]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:09:38 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:38.951 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1d7ca0ab-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:09:38 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:38.951 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb1660a-9cba-4f70-8ce8-7bf6c7da7ecd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:38 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:38.952 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d5cae9-eeb9-4ac8-87a5-461e57cd5ff2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:38 compute-0 systemd-machined[153303]: New machine qemu-8-instance-00000008.
Nov 22 10:09:38 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:38.963 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[8882f4c6-ca17-4c7c-a6b1-8083bee608a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:38 compute-0 NetworkManager[55425]: <info>  [1763806178.9666] device (tap0c051980-3a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:09:38 compute-0 NetworkManager[55425]: <info>  [1763806178.9676] device (tap0c051980-3a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:09:38 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Nov 22 10:09:38 compute-0 ovn_controller[95329]: 2025-11-22T10:09:38Z|00118|binding|INFO|Setting lport 0c051980-3a8d-48bb-9bf2-70309e50f76f ovn-installed in OVS
Nov 22 10:09:38 compute-0 ovn_controller[95329]: 2025-11-22T10:09:38Z|00119|binding|INFO|Setting lport 0c051980-3a8d-48bb-9bf2-70309e50f76f up in Southbound
Nov 22 10:09:38 compute-0 nova_compute[186981]: 2025-11-22 10:09:38.983 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:38 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:38.992 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7d21d1a9-7c5b-4489-8f91-a4f531e5e953]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.031 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[c7dbb006-ef5b-4a7b-89f5-1ef7c000875c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 NetworkManager[55425]: <info>  [1763806179.0376] manager: (tap1d7ca0ab-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.036 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[1d752d5d-99ed-4daf-b039-20f1c49ec5f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.067 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[31e8646d-82c4-4ee2-b711-00867620fa1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.070 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[fe544d46-c78f-4a3c-966f-ce5750b06977]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 NetworkManager[55425]: <info>  [1763806179.0962] device (tap1d7ca0ab-f0): carrier: link connected
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.102 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[d41e2b03-178b-466c-9512-6dfb85a3d69b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.121 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[11f97a0f-3599-4eb6-8f28-e3612aa956a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d7ca0ab-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:9f:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363680, 'reachable_time': 34890, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217267, 'error': None, 'target': 'ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.139 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[0c996d82-b014-44a9-8979-64ea5342440f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:9fcf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363680, 'tstamp': 363680}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217269, 'error': None, 'target': 'ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.156 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[14b2c225-19e9-4699-a1e8-7be6cf9ef87e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d7ca0ab-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:9f:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363680, 'reachable_time': 34890, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217270, 'error': None, 'target': 'ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.187 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[ee91b7fd-2871-4698-b439-5b0b928d384a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.238 186985 DEBUG nova.compute.manager [req-120f879d-a0bf-458a-a77c-b61a55207073 req-a934c5fc-af46-4e0a-9040-1195e188348a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Received event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.239 186985 DEBUG oslo_concurrency.lockutils [req-120f879d-a0bf-458a-a77c-b61a55207073 req-a934c5fc-af46-4e0a-9040-1195e188348a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.239 186985 DEBUG oslo_concurrency.lockutils [req-120f879d-a0bf-458a-a77c-b61a55207073 req-a934c5fc-af46-4e0a-9040-1195e188348a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.240 186985 DEBUG oslo_concurrency.lockutils [req-120f879d-a0bf-458a-a77c-b61a55207073 req-a934c5fc-af46-4e0a-9040-1195e188348a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.240 186985 DEBUG nova.compute.manager [req-120f879d-a0bf-458a-a77c-b61a55207073 req-a934c5fc-af46-4e0a-9040-1195e188348a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Processing event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.245 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[31fd0100-80cb-4ad8-a3ea-846d5f083553]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.246 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d7ca0ab-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.246 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.246 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d7ca0ab-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.248 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:39 compute-0 NetworkManager[55425]: <info>  [1763806179.2490] manager: (tap1d7ca0ab-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Nov 22 10:09:39 compute-0 kernel: tap1d7ca0ab-f0: entered promiscuous mode
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.250 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.251 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1d7ca0ab-f0, col_values=(('external_ids', {'iface-id': '6723ff38-f191-4910-bd16-36a1a7c95572'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.252 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:39 compute-0 ovn_controller[95329]: 2025-11-22T10:09:39Z|00120|binding|INFO|Releasing lport 6723ff38-f191-4910-bd16-36a1a7c95572 from this chassis (sb_readonly=0)
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.274 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.275 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d7ca0ab-f499-4866-82d7-e753ea2e04cb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d7ca0ab-f499-4866-82d7-e753ea2e04cb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.275 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.276 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[0b4d6b34-aa7e-412b-a318-9761da73aaae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.277 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-1d7ca0ab-f499-4866-82d7-e753ea2e04cb
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/1d7ca0ab-f499-4866-82d7-e753ea2e04cb.pid.haproxy
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID 1d7ca0ab-f499-4866-82d7-e753ea2e04cb
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:09:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:39.278 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'env', 'PROCESS_TAG=haproxy-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1d7ca0ab-f499-4866-82d7-e753ea2e04cb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.305 186985 DEBUG nova.network.neutron [req-be4c16ec-c71e-4c00-8520-3df37042e2df req-247e46ce-b85e-4ada-a6d9-6efb56678d71 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Updated VIF entry in instance network info cache for port 0c051980-3a8d-48bb-9bf2-70309e50f76f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.305 186985 DEBUG nova.network.neutron [req-be4c16ec-c71e-4c00-8520-3df37042e2df req-247e46ce-b85e-4ada-a6d9-6efb56678d71 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Updating instance_info_cache with network_info: [{"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:09:39 compute-0 nova_compute[186981]: 2025-11-22 10:09:39.323 186985 DEBUG oslo_concurrency.lockutils [req-be4c16ec-c71e-4c00-8520-3df37042e2df req-247e46ce-b85e-4ada-a6d9-6efb56678d71 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-1ed4e0cb-a7d4-4735-b408-704c1e6af103" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:09:39 compute-0 podman[217302]: 2025-11-22 10:09:39.66065418 +0000 UTC m=+0.060873694 container create 052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:09:39 compute-0 systemd[1]: Started libpod-conmon-052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81.scope.
Nov 22 10:09:39 compute-0 podman[217302]: 2025-11-22 10:09:39.628002048 +0000 UTC m=+0.028221552 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:09:39 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:09:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3863f1e413b45cd6703419965c96867bf246bb32bbf6e1c21dadcd13a7a5e656/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:09:39 compute-0 podman[217302]: 2025-11-22 10:09:39.753461644 +0000 UTC m=+0.153681158 container init 052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 10:09:39 compute-0 podman[217302]: 2025-11-22 10:09:39.763071386 +0000 UTC m=+0.163290870 container start 052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:09:39 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217319]: [NOTICE]   (217323) : New worker (217325) forked
Nov 22 10:09:39 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217319]: [NOTICE]   (217323) : Loading success.
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.052 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806180.0521092, 1ed4e0cb-a7d4-4735-b408-704c1e6af103 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.053 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] VM Started (Lifecycle Event)
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.055 186985 DEBUG nova.compute.manager [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.060 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.064 186985 INFO nova.virt.libvirt.driver [-] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Instance spawned successfully.
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.064 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.083 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.092 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.099 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.099 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.100 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.101 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.102 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.103 186985 DEBUG nova.virt.libvirt.driver [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.134 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.135 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806180.0521927, 1ed4e0cb-a7d4-4735-b408-704c1e6af103 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.135 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] VM Paused (Lifecycle Event)
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.166 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.170 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806180.0592604, 1ed4e0cb-a7d4-4735-b408-704c1e6af103 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.171 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] VM Resumed (Lifecycle Event)
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.179 186985 INFO nova.compute.manager [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Took 8.26 seconds to spawn the instance on the hypervisor.
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.179 186985 DEBUG nova.compute.manager [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.190 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.194 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.221 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.259 186985 INFO nova.compute.manager [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Took 8.80 seconds to build instance.
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.277 186985 DEBUG oslo_concurrency.lockutils [None req-2d5c267f-ceab-4684-84e5-d410d8c1d9b4 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.879s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:40 compute-0 nova_compute[186981]: 2025-11-22 10:09:40.616 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:41 compute-0 nova_compute[186981]: 2025-11-22 10:09:41.304 186985 DEBUG nova.compute.manager [req-e5b8c862-c218-4da3-9130-5d0db4377f3e req-3e476339-a2d9-4cd6-b16b-457472b3f73d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Received event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:41 compute-0 nova_compute[186981]: 2025-11-22 10:09:41.304 186985 DEBUG oslo_concurrency.lockutils [req-e5b8c862-c218-4da3-9130-5d0db4377f3e req-3e476339-a2d9-4cd6-b16b-457472b3f73d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:41 compute-0 nova_compute[186981]: 2025-11-22 10:09:41.304 186985 DEBUG oslo_concurrency.lockutils [req-e5b8c862-c218-4da3-9130-5d0db4377f3e req-3e476339-a2d9-4cd6-b16b-457472b3f73d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:41 compute-0 nova_compute[186981]: 2025-11-22 10:09:41.305 186985 DEBUG oslo_concurrency.lockutils [req-e5b8c862-c218-4da3-9130-5d0db4377f3e req-3e476339-a2d9-4cd6-b16b-457472b3f73d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:41 compute-0 nova_compute[186981]: 2025-11-22 10:09:41.305 186985 DEBUG nova.compute.manager [req-e5b8c862-c218-4da3-9130-5d0db4377f3e req-3e476339-a2d9-4cd6-b16b-457472b3f73d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] No waiting events found dispatching network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:09:41 compute-0 nova_compute[186981]: 2025-11-22 10:09:41.305 186985 WARNING nova.compute.manager [req-e5b8c862-c218-4da3-9130-5d0db4377f3e req-3e476339-a2d9-4cd6-b16b-457472b3f73d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Received unexpected event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f for instance with vm_state active and task_state None.
Nov 22 10:09:41 compute-0 podman[217341]: 2025-11-22 10:09:41.647372485 +0000 UTC m=+0.094220965 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:09:41 compute-0 podman[217342]: 2025-11-22 10:09:41.664837311 +0000 UTC m=+0.098181342 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Nov 22 10:09:42 compute-0 nova_compute[186981]: 2025-11-22 10:09:42.890 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:45 compute-0 nova_compute[186981]: 2025-11-22 10:09:45.619 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:45 compute-0 podman[217385]: 2025-11-22 10:09:45.640488141 +0000 UTC m=+0.096854165 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 10:09:45 compute-0 podman[217386]: 2025-11-22 10:09:45.640542233 +0000 UTC m=+0.085089444 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 10:09:46 compute-0 ovn_controller[95329]: 2025-11-22T10:09:46Z|00121|binding|INFO|Releasing lport 6723ff38-f191-4910-bd16-36a1a7c95572 from this chassis (sb_readonly=0)
Nov 22 10:09:46 compute-0 NetworkManager[55425]: <info>  [1763806186.4138] manager: (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.412 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:46 compute-0 NetworkManager[55425]: <info>  [1763806186.4165] manager: (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Nov 22 10:09:46 compute-0 ovn_controller[95329]: 2025-11-22T10:09:46Z|00122|binding|INFO|Releasing lport 6723ff38-f191-4910-bd16-36a1a7c95572 from this chassis (sb_readonly=0)
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.449 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.453 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.764 186985 DEBUG nova.compute.manager [req-7456ff6f-e8a1-4030-a11e-0cfa46cc93ce req-0e4836cb-5a36-49b4-98fb-db584933b5f0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Received event network-changed-0c051980-3a8d-48bb-9bf2-70309e50f76f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.765 186985 DEBUG nova.compute.manager [req-7456ff6f-e8a1-4030-a11e-0cfa46cc93ce req-0e4836cb-5a36-49b4-98fb-db584933b5f0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Refreshing instance network info cache due to event network-changed-0c051980-3a8d-48bb-9bf2-70309e50f76f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.765 186985 DEBUG oslo_concurrency.lockutils [req-7456ff6f-e8a1-4030-a11e-0cfa46cc93ce req-0e4836cb-5a36-49b4-98fb-db584933b5f0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-1ed4e0cb-a7d4-4735-b408-704c1e6af103" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.766 186985 DEBUG oslo_concurrency.lockutils [req-7456ff6f-e8a1-4030-a11e-0cfa46cc93ce req-0e4836cb-5a36-49b4-98fb-db584933b5f0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-1ed4e0cb-a7d4-4735-b408-704c1e6af103" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.766 186985 DEBUG nova.network.neutron [req-7456ff6f-e8a1-4030-a11e-0cfa46cc93ce req-0e4836cb-5a36-49b4-98fb-db584933b5f0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Refreshing network info cache for port 0c051980-3a8d-48bb-9bf2-70309e50f76f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.989 186985 DEBUG oslo_concurrency.lockutils [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.990 186985 DEBUG oslo_concurrency.lockutils [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.990 186985 DEBUG oslo_concurrency.lockutils [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.990 186985 DEBUG oslo_concurrency.lockutils [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.991 186985 DEBUG oslo_concurrency.lockutils [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.992 186985 INFO nova.compute.manager [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Terminating instance
Nov 22 10:09:46 compute-0 nova_compute[186981]: 2025-11-22 10:09:46.993 186985 DEBUG nova.compute.manager [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:09:47 compute-0 kernel: tap0c051980-3a (unregistering): left promiscuous mode
Nov 22 10:09:47 compute-0 NetworkManager[55425]: <info>  [1763806187.0134] device (tap0c051980-3a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.023 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:47 compute-0 ovn_controller[95329]: 2025-11-22T10:09:47Z|00123|binding|INFO|Releasing lport 0c051980-3a8d-48bb-9bf2-70309e50f76f from this chassis (sb_readonly=0)
Nov 22 10:09:47 compute-0 ovn_controller[95329]: 2025-11-22T10:09:47Z|00124|binding|INFO|Setting lport 0c051980-3a8d-48bb-9bf2-70309e50f76f down in Southbound
Nov 22 10:09:47 compute-0 ovn_controller[95329]: 2025-11-22T10:09:47Z|00125|binding|INFO|Removing iface tap0c051980-3a ovn-installed in OVS
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.027 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.035 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:59:3b 10.100.0.9'], port_security=['fa:16:3e:34:59:3b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-388161960', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1ed4e0cb-a7d4-4735-b408-704c1e6af103', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-388161960', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '27c5a67c-dc4c-4d67-b4f1-e6a36c0e1eec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31a25dd6-387e-429c-9754-fa9b4c2f743d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=0c051980-3a8d-48bb-9bf2-70309e50f76f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.036 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 0c051980-3a8d-48bb-9bf2-70309e50f76f in datapath 1d7ca0ab-f499-4866-82d7-e753ea2e04cb unbound from our chassis
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.037 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d7ca0ab-f499-4866-82d7-e753ea2e04cb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.038 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[eed5ef7b-0ce4-43fc-a2ad-44328b5ea86b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.040 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb namespace which is not needed anymore
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.041 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:47 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 22 10:09:47 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 8.219s CPU time.
Nov 22 10:09:47 compute-0 systemd-machined[153303]: Machine qemu-8-instance-00000008 terminated.
Nov 22 10:09:47 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217319]: [NOTICE]   (217323) : haproxy version is 2.8.14-c23fe91
Nov 22 10:09:47 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217319]: [NOTICE]   (217323) : path to executable is /usr/sbin/haproxy
Nov 22 10:09:47 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217319]: [WARNING]  (217323) : Exiting Master process...
Nov 22 10:09:47 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217319]: [WARNING]  (217323) : Exiting Master process...
Nov 22 10:09:47 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217319]: [ALERT]    (217323) : Current worker (217325) exited with code 143 (Terminated)
Nov 22 10:09:47 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217319]: [WARNING]  (217323) : All workers exited. Exiting... (0)
Nov 22 10:09:47 compute-0 systemd[1]: libpod-052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81.scope: Deactivated successfully.
Nov 22 10:09:47 compute-0 podman[217453]: 2025-11-22 10:09:47.169309072 +0000 UTC m=+0.046800849 container died 052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 10:09:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81-userdata-shm.mount: Deactivated successfully.
Nov 22 10:09:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-3863f1e413b45cd6703419965c96867bf246bb32bbf6e1c21dadcd13a7a5e656-merged.mount: Deactivated successfully.
Nov 22 10:09:47 compute-0 podman[217453]: 2025-11-22 10:09:47.205773868 +0000 UTC m=+0.083265655 container cleanup 052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 10:09:47 compute-0 systemd[1]: libpod-conmon-052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81.scope: Deactivated successfully.
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.264 186985 INFO nova.virt.libvirt.driver [-] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Instance destroyed successfully.
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.265 186985 DEBUG nova.objects.instance [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid 1ed4e0cb-a7d4-4735-b408-704c1e6af103 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:09:47 compute-0 podman[217486]: 2025-11-22 10:09:47.280814847 +0000 UTC m=+0.052829473 container remove 052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.282 186985 DEBUG nova.virt.libvirt.vif [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:09:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-889314149',display_name='tempest-TestNetworkBasicOps-server-889314149',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-889314149',id=8,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOuO8gZJmd4GIU5/Mlkmp2LS/WXbBfklBSHTRD8Pn7ukzFfSpZn6afUz0/rL1MpDERyTWSLjctvjF5b7pTi2j6kp0RA+zwGrfzot1xqw/ah8C+rZia7K3tk/DlcdIsS8/g==',key_name='tempest-TestNetworkBasicOps-1262090559',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:09:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-y75xifml',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:09:40Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=1ed4e0cb-a7d4-4735-b408-704c1e6af103,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.282 186985 DEBUG nova.network.os_vif_util [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.284 186985 DEBUG nova.network.os_vif_util [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.284 186985 DEBUG os_vif [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.287 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.288 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c051980-3a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.290 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c3a9dd-5daa-4f66-9266-c81c250a57b5]: (4, ('Sat Nov 22 10:09:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb (052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81)\n052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81\nSat Nov 22 10:09:47 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb (052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81)\n052f60920cb3cf0e78c8d2f1e71ee73e05fb9c8351ea2aab6007b32b2c9cee81\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.292 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[1d67e19d-4da7-489f-a9dc-60742d9c24de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.293 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d7ca0ab-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.339 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:47 compute-0 kernel: tap1d7ca0ab-f0: left promiscuous mode
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.341 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.350 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.352 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[ee8920f7-45c7-4053-89e1-f514422f0d00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.355 186985 INFO os_vif [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a')
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.356 186985 INFO nova.virt.libvirt.driver [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Deleting instance files /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103_del
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.357 186985 INFO nova.virt.libvirt.driver [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Deletion of /var/lib/nova/instances/1ed4e0cb-a7d4-4735-b408-704c1e6af103_del complete
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.372 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0ac398-8c4b-470f-a221-eb5a688dd9c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.374 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[5df543f8-56ba-4e4c-8da4-971707f37d98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.388 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[f4a46db7-eeb4-4180-8298-7a2e1b842a73]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363673, 'reachable_time': 29121, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217514, 'error': None, 'target': 'ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.391 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:09:47 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:09:47.391 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[18824b0c-8376-4575-a43a-84eb994d0547]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:09:47 compute-0 systemd[1]: run-netns-ovnmeta\x2d1d7ca0ab\x2df499\x2d4866\x2d82d7\x2de753ea2e04cb.mount: Deactivated successfully.
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.412 186985 INFO nova.compute.manager [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.413 186985 DEBUG oslo.service.loopingcall [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.413 186985 DEBUG nova.compute.manager [-] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:09:47 compute-0 nova_compute[186981]: 2025-11-22 10:09:47.414 186985 DEBUG nova.network.neutron [-] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.629 186985 DEBUG nova.network.neutron [req-7456ff6f-e8a1-4030-a11e-0cfa46cc93ce req-0e4836cb-5a36-49b4-98fb-db584933b5f0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Updated VIF entry in instance network info cache for port 0c051980-3a8d-48bb-9bf2-70309e50f76f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.630 186985 DEBUG nova.network.neutron [req-7456ff6f-e8a1-4030-a11e-0cfa46cc93ce req-0e4836cb-5a36-49b4-98fb-db584933b5f0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Updating instance_info_cache with network_info: [{"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.651 186985 DEBUG oslo_concurrency.lockutils [req-7456ff6f-e8a1-4030-a11e-0cfa46cc93ce req-0e4836cb-5a36-49b4-98fb-db584933b5f0 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-1ed4e0cb-a7d4-4735-b408-704c1e6af103" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.880 186985 DEBUG nova.compute.manager [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Received event network-vif-unplugged-0c051980-3a8d-48bb-9bf2-70309e50f76f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.881 186985 DEBUG oslo_concurrency.lockutils [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.881 186985 DEBUG oslo_concurrency.lockutils [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.881 186985 DEBUG oslo_concurrency.lockutils [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.881 186985 DEBUG nova.compute.manager [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] No waiting events found dispatching network-vif-unplugged-0c051980-3a8d-48bb-9bf2-70309e50f76f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.882 186985 DEBUG nova.compute.manager [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Received event network-vif-unplugged-0c051980-3a8d-48bb-9bf2-70309e50f76f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.882 186985 DEBUG nova.compute.manager [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Received event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.882 186985 DEBUG oslo_concurrency.lockutils [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.882 186985 DEBUG oslo_concurrency.lockutils [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.882 186985 DEBUG oslo_concurrency.lockutils [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.883 186985 DEBUG nova.compute.manager [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] No waiting events found dispatching network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.883 186985 WARNING nova.compute.manager [req-b42f2851-4f1b-4e83-b5f4-d1b633610a09 req-e3c7c7bf-8a09-4c21-9bf9-e6d905e5f531 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Received unexpected event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f for instance with vm_state active and task_state deleting.
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.957 186985 DEBUG nova.network.neutron [-] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:09:48 compute-0 nova_compute[186981]: 2025-11-22 10:09:48.978 186985 INFO nova.compute.manager [-] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Took 1.56 seconds to deallocate network for instance.
Nov 22 10:09:49 compute-0 nova_compute[186981]: 2025-11-22 10:09:49.049 186985 DEBUG oslo_concurrency.lockutils [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:49 compute-0 nova_compute[186981]: 2025-11-22 10:09:49.050 186985 DEBUG oslo_concurrency.lockutils [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:49 compute-0 nova_compute[186981]: 2025-11-22 10:09:49.154 186985 DEBUG nova.compute.provider_tree [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:09:49 compute-0 nova_compute[186981]: 2025-11-22 10:09:49.176 186985 DEBUG nova.scheduler.client.report [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:09:49 compute-0 nova_compute[186981]: 2025-11-22 10:09:49.203 186985 DEBUG oslo_concurrency.lockutils [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:49 compute-0 nova_compute[186981]: 2025-11-22 10:09:49.225 186985 INFO nova.scheduler.client.report [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance 1ed4e0cb-a7d4-4735-b408-704c1e6af103
Nov 22 10:09:49 compute-0 nova_compute[186981]: 2025-11-22 10:09:49.300 186985 DEBUG oslo_concurrency.lockutils [None req-2929d2bc-c0c2-4655-b9c8-e777295f5736 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "1ed4e0cb-a7d4-4735-b408-704c1e6af103" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:50 compute-0 nova_compute[186981]: 2025-11-22 10:09:50.667 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:52 compute-0 nova_compute[186981]: 2025-11-22 10:09:52.340 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.340 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.341 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.354 186985 DEBUG nova.compute.manager [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.415 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.415 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.420 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.420 186985 INFO nova.compute.claims [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.513 186985 DEBUG nova.compute.provider_tree [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.525 186985 DEBUG nova.scheduler.client.report [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.543 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.544 186985 DEBUG nova.compute.manager [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.604 186985 DEBUG nova.compute.manager [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.605 186985 DEBUG nova.network.neutron [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.623 186985 INFO nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.645 186985 DEBUG nova.compute.manager [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.669 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.750 186985 DEBUG nova.compute.manager [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.752 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.753 186985 INFO nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Creating image(s)
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.753 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.754 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.755 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.779 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.838 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.839 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.839 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.849 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.905 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.906 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.945 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.946 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.946 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.997 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.998 186985 DEBUG nova.virt.disk.api [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:09:55 compute-0 nova_compute[186981]: 2025-11-22 10:09:55.999 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:09:56 compute-0 nova_compute[186981]: 2025-11-22 10:09:56.051 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:09:56 compute-0 nova_compute[186981]: 2025-11-22 10:09:56.052 186985 DEBUG nova.virt.disk.api [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:09:56 compute-0 nova_compute[186981]: 2025-11-22 10:09:56.052 186985 DEBUG nova.objects.instance [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:09:56 compute-0 nova_compute[186981]: 2025-11-22 10:09:56.082 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:09:56 compute-0 nova_compute[186981]: 2025-11-22 10:09:56.083 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Ensure instance console log exists: /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:09:56 compute-0 nova_compute[186981]: 2025-11-22 10:09:56.085 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:56 compute-0 nova_compute[186981]: 2025-11-22 10:09:56.086 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:56 compute-0 nova_compute[186981]: 2025-11-22 10:09:56.086 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:56 compute-0 nova_compute[186981]: 2025-11-22 10:09:56.290 186985 DEBUG nova.policy [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:09:56 compute-0 podman[217530]: 2025-11-22 10:09:56.621703876 +0000 UTC m=+0.063514555 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:09:57 compute-0 nova_compute[186981]: 2025-11-22 10:09:57.340 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:58 compute-0 nova_compute[186981]: 2025-11-22 10:09:58.010 186985 DEBUG nova.network.neutron [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Successfully updated port: 0c051980-3a8d-48bb-9bf2-70309e50f76f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:09:58 compute-0 nova_compute[186981]: 2025-11-22 10:09:58.025 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:09:58 compute-0 nova_compute[186981]: 2025-11-22 10:09:58.026 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:09:58 compute-0 nova_compute[186981]: 2025-11-22 10:09:58.026 186985 DEBUG nova.network.neutron [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:09:58 compute-0 nova_compute[186981]: 2025-11-22 10:09:58.097 186985 DEBUG nova.compute.manager [req-0aa104e3-320f-4845-bfe1-c98f9a215668 req-e4725bd1-61b9-4bea-84fd-9806110e972d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Received event network-changed-0c051980-3a8d-48bb-9bf2-70309e50f76f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:09:58 compute-0 nova_compute[186981]: 2025-11-22 10:09:58.098 186985 DEBUG nova.compute.manager [req-0aa104e3-320f-4845-bfe1-c98f9a215668 req-e4725bd1-61b9-4bea-84fd-9806110e972d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Refreshing instance network info cache due to event network-changed-0c051980-3a8d-48bb-9bf2-70309e50f76f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:09:58 compute-0 nova_compute[186981]: 2025-11-22 10:09:58.098 186985 DEBUG oslo_concurrency.lockutils [req-0aa104e3-320f-4845-bfe1-c98f9a215668 req-e4725bd1-61b9-4bea-84fd-9806110e972d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:09:58 compute-0 nova_compute[186981]: 2025-11-22 10:09:58.197 186985 DEBUG nova.network.neutron [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.486 186985 DEBUG nova.network.neutron [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Updating instance_info_cache with network_info: [{"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.506 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.507 186985 DEBUG nova.compute.manager [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Instance network_info: |[{"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.507 186985 DEBUG oslo_concurrency.lockutils [req-0aa104e3-320f-4845-bfe1-c98f9a215668 req-e4725bd1-61b9-4bea-84fd-9806110e972d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.508 186985 DEBUG nova.network.neutron [req-0aa104e3-320f-4845-bfe1-c98f9a215668 req-e4725bd1-61b9-4bea-84fd-9806110e972d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Refreshing network info cache for port 0c051980-3a8d-48bb-9bf2-70309e50f76f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.511 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Start _get_guest_xml network_info=[{"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.517 186985 WARNING nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.525 186985 DEBUG nova.virt.libvirt.host [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.526 186985 DEBUG nova.virt.libvirt.host [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.530 186985 DEBUG nova.virt.libvirt.host [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.531 186985 DEBUG nova.virt.libvirt.host [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.531 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.532 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.532 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.533 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.533 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.533 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.534 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.534 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.534 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.535 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.535 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.535 186985 DEBUG nova.virt.hardware [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.540 186985 DEBUG nova.virt.libvirt.vif [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:09:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2037008835',display_name='tempest-TestNetworkBasicOps-server-2037008835',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2037008835',id=9,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMTEfIrl/c1xgYtVrqGakXiYA1IVtQYlhcrTyMM8E8cJ1J/x4NOEOLBBoao1CAraZ1TrRTMANuxwWPxEsNhXA9n1TIVm6apinCfDVwZv5HQkI0mx0rR01x+cq2ZTHutFQw==',key_name='tempest-TestNetworkBasicOps-1710267300',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-tsou07b8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:09:55Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=230dfa9a-6ddb-49a0-90dd-c4b6b4c47183,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.541 186985 DEBUG nova.network.os_vif_util [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.542 186985 DEBUG nova.network.os_vif_util [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.543 186985 DEBUG nova.objects.instance [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.556 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:09:59 compute-0 nova_compute[186981]:   <uuid>230dfa9a-6ddb-49a0-90dd-c4b6b4c47183</uuid>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   <name>instance-00000009</name>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-2037008835</nova:name>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:09:59</nova:creationTime>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:09:59 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:09:59 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:09:59 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:09:59 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:09:59 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:09:59 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:09:59 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:09:59 compute-0 nova_compute[186981]:         <nova:port uuid="0c051980-3a8d-48bb-9bf2-70309e50f76f">
Nov 22 10:09:59 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <system>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <entry name="serial">230dfa9a-6ddb-49a0-90dd-c4b6b4c47183</entry>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <entry name="uuid">230dfa9a-6ddb-49a0-90dd-c4b6b4c47183</entry>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     </system>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   <os>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   </os>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   <features>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   </features>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk.config"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:34:59:3b"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <target dev="tap0c051980-3a"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/console.log" append="off"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <video>
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     </video>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:09:59 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:09:59 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:09:59 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:09:59 compute-0 nova_compute[186981]: </domain>
Nov 22 10:09:59 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.558 186985 DEBUG nova.compute.manager [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Preparing to wait for external event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.559 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.560 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.561 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.562 186985 DEBUG nova.virt.libvirt.vif [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:09:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2037008835',display_name='tempest-TestNetworkBasicOps-server-2037008835',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2037008835',id=9,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMTEfIrl/c1xgYtVrqGakXiYA1IVtQYlhcrTyMM8E8cJ1J/x4NOEOLBBoao1CAraZ1TrRTMANuxwWPxEsNhXA9n1TIVm6apinCfDVwZv5HQkI0mx0rR01x+cq2ZTHutFQw==',key_name='tempest-TestNetworkBasicOps-1710267300',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-tsou07b8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:09:55Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=230dfa9a-6ddb-49a0-90dd-c4b6b4c47183,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.563 186985 DEBUG nova.network.os_vif_util [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.565 186985 DEBUG nova.network.os_vif_util [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.566 186985 DEBUG os_vif [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.568 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.568 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.570 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.577 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.578 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c051980-3a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.579 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0c051980-3a, col_values=(('external_ids', {'iface-id': '0c051980-3a8d-48bb-9bf2-70309e50f76f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:59:3b', 'vm-uuid': '230dfa9a-6ddb-49a0-90dd-c4b6b4c47183'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.583 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:59 compute-0 NetworkManager[55425]: <info>  [1763806199.5833] manager: (tap0c051980-3a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.587 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.590 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.593 186985 INFO os_vif [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a')
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.656 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.657 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.657 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:34:59:3b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:09:59 compute-0 nova_compute[186981]: 2025-11-22 10:09:59.658 186985 INFO nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Using config drive
Nov 22 10:10:00 compute-0 nova_compute[186981]: 2025-11-22 10:10:00.671 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:00 compute-0 nova_compute[186981]: 2025-11-22 10:10:00.713 186985 INFO nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Creating config drive at /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk.config
Nov 22 10:10:00 compute-0 nova_compute[186981]: 2025-11-22 10:10:00.720 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpag3u85ey execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:10:00 compute-0 nova_compute[186981]: 2025-11-22 10:10:00.846 186985 DEBUG oslo_concurrency.processutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpag3u85ey" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:10:00 compute-0 kernel: tap0c051980-3a: entered promiscuous mode
Nov 22 10:10:00 compute-0 NetworkManager[55425]: <info>  [1763806200.9069] manager: (tap0c051980-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Nov 22 10:10:00 compute-0 ovn_controller[95329]: 2025-11-22T10:10:00Z|00126|binding|INFO|Claiming lport 0c051980-3a8d-48bb-9bf2-70309e50f76f for this chassis.
Nov 22 10:10:00 compute-0 ovn_controller[95329]: 2025-11-22T10:10:00Z|00127|binding|INFO|0c051980-3a8d-48bb-9bf2-70309e50f76f: Claiming fa:16:3e:34:59:3b 10.100.0.9
Nov 22 10:10:00 compute-0 nova_compute[186981]: 2025-11-22 10:10:00.907 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.913 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:59:3b 10.100.0.9'], port_security=['fa:16:3e:34:59:3b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-388161960', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '230dfa9a-6ddb-49a0-90dd-c4b6b4c47183', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-388161960', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '7', 'neutron:security_group_ids': '27c5a67c-dc4c-4d67-b4f1-e6a36c0e1eec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31a25dd6-387e-429c-9754-fa9b4c2f743d, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=0c051980-3a8d-48bb-9bf2-70309e50f76f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.914 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 0c051980-3a8d-48bb-9bf2-70309e50f76f in datapath 1d7ca0ab-f499-4866-82d7-e753ea2e04cb bound to our chassis
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.915 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1d7ca0ab-f499-4866-82d7-e753ea2e04cb
Nov 22 10:10:00 compute-0 ovn_controller[95329]: 2025-11-22T10:10:00Z|00128|binding|INFO|Setting lport 0c051980-3a8d-48bb-9bf2-70309e50f76f ovn-installed in OVS
Nov 22 10:10:00 compute-0 ovn_controller[95329]: 2025-11-22T10:10:00Z|00129|binding|INFO|Setting lport 0c051980-3a8d-48bb-9bf2-70309e50f76f up in Southbound
Nov 22 10:10:00 compute-0 nova_compute[186981]: 2025-11-22 10:10:00.919 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:00 compute-0 nova_compute[186981]: 2025-11-22 10:10:00.924 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.927 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[3ff58d03-bf58-49ce-86ba-4d2f1dd582f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.929 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1d7ca0ab-f1 in ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.931 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1d7ca0ab-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.931 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[0e932559-e884-487f-8949-7033127eae75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.933 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[280b05a5-96e0-429d-a346-738f42916f65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.946 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[bbae1dfc-2086-4c98-a398-5177f8ddf795]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:00 compute-0 systemd-udevd[217578]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:10:00 compute-0 systemd-machined[153303]: New machine qemu-9-instance-00000009.
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.959 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[01865d74-692f-4ee1-a9c0-7eff8366ef74]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:00 compute-0 NetworkManager[55425]: <info>  [1763806200.9646] device (tap0c051980-3a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:10:00 compute-0 NetworkManager[55425]: <info>  [1763806200.9658] device (tap0c051980-3a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:10:00 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.993 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[67777708-2f3d-4f56-81ac-59c37217c8fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:00 compute-0 NetworkManager[55425]: <info>  [1763806200.9990] manager: (tap1d7ca0ab-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Nov 22 10:10:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:00.998 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b966f7-346f-45fd-b247-210dbc888d23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.043 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[95dbd108-a1f3-4a56-8504-bd443ef1bd6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.047 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[da6ee5c8-be86-40fb-ae2f-ef21de6bdc9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:01 compute-0 NetworkManager[55425]: <info>  [1763806201.0766] device (tap1d7ca0ab-f0): carrier: link connected
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.086 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e9a88b-b052-4099-862e-504dd9c30acd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.114 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[334a0f8a-da17-464c-a460-41d878b61521]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d7ca0ab-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:9f:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365878, 'reachable_time': 23178, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217609, 'error': None, 'target': 'ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.139 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[f8caf4b3-2986-44f9-94e5-6969e95728b1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:9fcf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365878, 'tstamp': 365878}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217610, 'error': None, 'target': 'ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.168 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[c095c457-22ae-49c2-b1f2-33bb98f192d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d7ca0ab-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:9f:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365878, 'reachable_time': 23178, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217611, 'error': None, 'target': 'ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.210 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[440c32d5-c08f-48f1-b90f-58922fe3eb3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.299 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[da20fe86-8e51-4cf7-8e49-2ff73a04a43f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.301 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d7ca0ab-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.301 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.301 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d7ca0ab-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.303 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:01 compute-0 kernel: tap1d7ca0ab-f0: entered promiscuous mode
Nov 22 10:10:01 compute-0 NetworkManager[55425]: <info>  [1763806201.3046] manager: (tap1d7ca0ab-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.305 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1d7ca0ab-f0, col_values=(('external_ids', {'iface-id': '6723ff38-f191-4910-bd16-36a1a7c95572'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.306 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:01 compute-0 ovn_controller[95329]: 2025-11-22T10:10:01Z|00130|binding|INFO|Releasing lport 6723ff38-f191-4910-bd16-36a1a7c95572 from this chassis (sb_readonly=0)
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.316 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.317 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d7ca0ab-f499-4866-82d7-e753ea2e04cb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d7ca0ab-f499-4866-82d7-e753ea2e04cb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.318 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[33aab9af-346d-4462-97fe-eb1d1ea360ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.319 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-1d7ca0ab-f499-4866-82d7-e753ea2e04cb
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/1d7ca0ab-f499-4866-82d7-e753ea2e04cb.pid.haproxy
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID 1d7ca0ab-f499-4866-82d7-e753ea2e04cb
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:10:01 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:01.320 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'env', 'PROCESS_TAG=haproxy-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1d7ca0ab-f499-4866-82d7-e753ea2e04cb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.624 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806201.6237173, 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.624 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] VM Started (Lifecycle Event)
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.652 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.655 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806201.623922, 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.655 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] VM Paused (Lifecycle Event)
Nov 22 10:10:01 compute-0 podman[217650]: 2025-11-22 10:10:01.668877449 +0000 UTC m=+0.056865005 container create 96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:10:01 compute-0 systemd[1]: Started libpod-conmon-96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311.scope.
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.708 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.711 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:10:01 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:10:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90771fca274a59a6964e1be7f842d58176f84826abda8e08670ca19c1cdb938d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:10:01 compute-0 podman[217650]: 2025-11-22 10:10:01.63706727 +0000 UTC m=+0.025054906 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:10:01 compute-0 podman[217650]: 2025-11-22 10:10:01.740441663 +0000 UTC m=+0.128429229 container init 96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:10:01 compute-0 podman[217650]: 2025-11-22 10:10:01.747094285 +0000 UTC m=+0.135081851 container start 96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:10:01 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217665]: [NOTICE]   (217669) : New worker (217671) forked
Nov 22 10:10:01 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217665]: [NOTICE]   (217669) : Loading success.
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.768 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.955 186985 DEBUG nova.compute.manager [req-a9d430e7-465b-4c3a-8e88-f29c5d09699b req-75ead469-a458-4272-81d8-4110ac3da913 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Received event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.956 186985 DEBUG oslo_concurrency.lockutils [req-a9d430e7-465b-4c3a-8e88-f29c5d09699b req-75ead469-a458-4272-81d8-4110ac3da913 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.956 186985 DEBUG oslo_concurrency.lockutils [req-a9d430e7-465b-4c3a-8e88-f29c5d09699b req-75ead469-a458-4272-81d8-4110ac3da913 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.957 186985 DEBUG oslo_concurrency.lockutils [req-a9d430e7-465b-4c3a-8e88-f29c5d09699b req-75ead469-a458-4272-81d8-4110ac3da913 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.957 186985 DEBUG nova.compute.manager [req-a9d430e7-465b-4c3a-8e88-f29c5d09699b req-75ead469-a458-4272-81d8-4110ac3da913 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Processing event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.958 186985 DEBUG nova.compute.manager [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.963 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806201.963427, 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.964 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] VM Resumed (Lifecycle Event)
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.967 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.972 186985 INFO nova.virt.libvirt.driver [-] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Instance spawned successfully.
Nov 22 10:10:01 compute-0 nova_compute[186981]: 2025-11-22 10:10:01.973 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.009 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.010 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.010 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.010 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.011 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.011 186985 DEBUG nova.virt.libvirt.driver [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.014 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.017 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.087 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.125 186985 INFO nova.compute.manager [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Took 6.37 seconds to spawn the instance on the hypervisor.
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.126 186985 DEBUG nova.compute.manager [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.261 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763806187.2606719, 1ed4e0cb-a7d4-4735-b408-704c1e6af103 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.262 186985 INFO nova.compute.manager [-] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] VM Stopped (Lifecycle Event)
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.277 186985 INFO nova.compute.manager [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Took 6.88 seconds to build instance.
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.302 186985 DEBUG nova.compute.manager [None req-01a9a20e-4a62-4b36-a504-f703d91a9bd9 - - - - - -] [instance: 1ed4e0cb-a7d4-4735-b408-704c1e6af103] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.322 186985 DEBUG oslo_concurrency.lockutils [None req-a7f2a336-f637-4ba3-b741-f59e0173de22 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.981s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.937 186985 DEBUG nova.network.neutron [req-0aa104e3-320f-4845-bfe1-c98f9a215668 req-e4725bd1-61b9-4bea-84fd-9806110e972d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Updated VIF entry in instance network info cache for port 0c051980-3a8d-48bb-9bf2-70309e50f76f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.938 186985 DEBUG nova.network.neutron [req-0aa104e3-320f-4845-bfe1-c98f9a215668 req-e4725bd1-61b9-4bea-84fd-9806110e972d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Updating instance_info_cache with network_info: [{"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:10:02 compute-0 nova_compute[186981]: 2025-11-22 10:10:02.957 186985 DEBUG oslo_concurrency.lockutils [req-0aa104e3-320f-4845-bfe1-c98f9a215668 req-e4725bd1-61b9-4bea-84fd-9806110e972d 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:10:04 compute-0 nova_compute[186981]: 2025-11-22 10:10:04.089 186985 DEBUG nova.compute.manager [req-cd94d95f-7fcd-42f4-b846-903b8ac490ed req-e0c8890d-648e-4847-944e-195ad76a3b78 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Received event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:10:04 compute-0 nova_compute[186981]: 2025-11-22 10:10:04.090 186985 DEBUG oslo_concurrency.lockutils [req-cd94d95f-7fcd-42f4-b846-903b8ac490ed req-e0c8890d-648e-4847-944e-195ad76a3b78 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:04 compute-0 nova_compute[186981]: 2025-11-22 10:10:04.090 186985 DEBUG oslo_concurrency.lockutils [req-cd94d95f-7fcd-42f4-b846-903b8ac490ed req-e0c8890d-648e-4847-944e-195ad76a3b78 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:04 compute-0 nova_compute[186981]: 2025-11-22 10:10:04.091 186985 DEBUG oslo_concurrency.lockutils [req-cd94d95f-7fcd-42f4-b846-903b8ac490ed req-e0c8890d-648e-4847-944e-195ad76a3b78 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:04 compute-0 nova_compute[186981]: 2025-11-22 10:10:04.091 186985 DEBUG nova.compute.manager [req-cd94d95f-7fcd-42f4-b846-903b8ac490ed req-e0c8890d-648e-4847-944e-195ad76a3b78 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] No waiting events found dispatching network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:10:04 compute-0 nova_compute[186981]: 2025-11-22 10:10:04.092 186985 WARNING nova.compute.manager [req-cd94d95f-7fcd-42f4-b846-903b8ac490ed req-e0c8890d-648e-4847-944e-195ad76a3b78 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Received unexpected event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f for instance with vm_state active and task_state None.
Nov 22 10:10:04 compute-0 nova_compute[186981]: 2025-11-22 10:10:04.582 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.240 186985 DEBUG oslo_concurrency.lockutils [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.240 186985 DEBUG oslo_concurrency.lockutils [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.241 186985 DEBUG oslo_concurrency.lockutils [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.241 186985 DEBUG oslo_concurrency.lockutils [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.242 186985 DEBUG oslo_concurrency.lockutils [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.243 186985 INFO nova.compute.manager [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Terminating instance
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.245 186985 DEBUG nova.compute.manager [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:10:05 compute-0 kernel: tap0c051980-3a (unregistering): left promiscuous mode
Nov 22 10:10:05 compute-0 NetworkManager[55425]: <info>  [1763806205.2720] device (tap0c051980-3a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:10:05 compute-0 ovn_controller[95329]: 2025-11-22T10:10:05Z|00131|binding|INFO|Releasing lport 0c051980-3a8d-48bb-9bf2-70309e50f76f from this chassis (sb_readonly=0)
Nov 22 10:10:05 compute-0 ovn_controller[95329]: 2025-11-22T10:10:05Z|00132|binding|INFO|Setting lport 0c051980-3a8d-48bb-9bf2-70309e50f76f down in Southbound
Nov 22 10:10:05 compute-0 ovn_controller[95329]: 2025-11-22T10:10:05Z|00133|binding|INFO|Removing iface tap0c051980-3a ovn-installed in OVS
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.278 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.287 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:59:3b 10.100.0.9'], port_security=['fa:16:3e:34:59:3b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-388161960', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '230dfa9a-6ddb-49a0-90dd-c4b6b4c47183', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-388161960', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '9', 'neutron:security_group_ids': '27c5a67c-dc4c-4d67-b4f1-e6a36c0e1eec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.239', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31a25dd6-387e-429c-9754-fa9b4c2f743d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=0c051980-3a8d-48bb-9bf2-70309e50f76f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.290 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 0c051980-3a8d-48bb-9bf2-70309e50f76f in datapath 1d7ca0ab-f499-4866-82d7-e753ea2e04cb unbound from our chassis
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.292 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d7ca0ab-f499-4866-82d7-e753ea2e04cb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.293 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[fc02911d-ab80-4cd8-a623-8d17b0feecec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.294 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb namespace which is not needed anymore
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.295 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:05 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 22 10:10:05 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 3.992s CPU time.
Nov 22 10:10:05 compute-0 systemd-machined[153303]: Machine qemu-9-instance-00000009 terminated.
Nov 22 10:10:05 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217665]: [NOTICE]   (217669) : haproxy version is 2.8.14-c23fe91
Nov 22 10:10:05 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217665]: [NOTICE]   (217669) : path to executable is /usr/sbin/haproxy
Nov 22 10:10:05 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217665]: [WARNING]  (217669) : Exiting Master process...
Nov 22 10:10:05 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217665]: [ALERT]    (217669) : Current worker (217671) exited with code 143 (Terminated)
Nov 22 10:10:05 compute-0 neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb[217665]: [WARNING]  (217669) : All workers exited. Exiting... (0)
Nov 22 10:10:05 compute-0 systemd[1]: libpod-96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311.scope: Deactivated successfully.
Nov 22 10:10:05 compute-0 podman[217704]: 2025-11-22 10:10:05.453861563 +0000 UTC m=+0.066961060 container died 96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:10:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311-userdata-shm.mount: Deactivated successfully.
Nov 22 10:10:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-90771fca274a59a6964e1be7f842d58176f84826abda8e08670ca19c1cdb938d-merged.mount: Deactivated successfully.
Nov 22 10:10:05 compute-0 podman[217704]: 2025-11-22 10:10:05.494111261 +0000 UTC m=+0.107210788 container cleanup 96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 10:10:05 compute-0 systemd[1]: libpod-conmon-96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311.scope: Deactivated successfully.
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.512 186985 INFO nova.virt.libvirt.driver [-] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Instance destroyed successfully.
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.512 186985 DEBUG nova.objects.instance [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.524 186985 DEBUG nova.virt.libvirt.vif [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:09:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2037008835',display_name='tempest-TestNetworkBasicOps-server-2037008835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2037008835',id=9,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMTEfIrl/c1xgYtVrqGakXiYA1IVtQYlhcrTyMM8E8cJ1J/x4NOEOLBBoao1CAraZ1TrRTMANuxwWPxEsNhXA9n1TIVm6apinCfDVwZv5HQkI0mx0rR01x+cq2ZTHutFQw==',key_name='tempest-TestNetworkBasicOps-1710267300',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:10:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-tsou07b8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:10:02Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=230dfa9a-6ddb-49a0-90dd-c4b6b4c47183,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.524 186985 DEBUG nova.network.os_vif_util [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "address": "fa:16:3e:34:59:3b", "network": {"id": "1d7ca0ab-f499-4866-82d7-e753ea2e04cb", "bridge": "br-int", "label": "tempest-network-smoke--2051683979", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c051980-3a", "ovs_interfaceid": "0c051980-3a8d-48bb-9bf2-70309e50f76f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.525 186985 DEBUG nova.network.os_vif_util [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.525 186985 DEBUG os_vif [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.527 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.527 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c051980-3a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.529 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.530 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.531 186985 INFO os_vif [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:59:3b,bridge_name='br-int',has_traffic_filtering=True,id=0c051980-3a8d-48bb-9bf2-70309e50f76f,network=Network(1d7ca0ab-f499-4866-82d7-e753ea2e04cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0c051980-3a')
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.532 186985 INFO nova.virt.libvirt.driver [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Deleting instance files /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183_del
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.533 186985 INFO nova.virt.libvirt.driver [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Deletion of /var/lib/nova/instances/230dfa9a-6ddb-49a0-90dd-c4b6b4c47183_del complete
Nov 22 10:10:05 compute-0 podman[217752]: 2025-11-22 10:10:05.566692034 +0000 UTC m=+0.041528866 container remove 96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.571 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[c49653b7-31fd-4ed7-9352-9f38de79c916]: (4, ('Sat Nov 22 10:10:05 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb (96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311)\n96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311\nSat Nov 22 10:10:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb (96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311)\n96106e681f03de6b321cb849fd69b3c8f437354e52412ba76605abbc4e598311\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.572 186985 INFO nova.compute.manager [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Took 0.33 seconds to destroy the instance on the hypervisor.
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.573 186985 DEBUG oslo.service.loopingcall [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.573 186985 DEBUG nova.compute.manager [-] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.574 186985 DEBUG nova.network.neutron [-] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.574 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[1c5d0bc2-dbbf-461c-b474-edd34146f8d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.576 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d7ca0ab-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.578 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:05 compute-0 kernel: tap1d7ca0ab-f0: left promiscuous mode
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.647 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.651 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[eb82ab29-ea73-4e86-adbb-34ce4d3391a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.662 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7bde8c82-409d-4900-825e-694463bc040f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.664 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[6e198af8-fc6b-416e-9d6a-4e560a710911]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:05 compute-0 nova_compute[186981]: 2025-11-22 10:10:05.673 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.677 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[a58d4d8e-7046-477a-ba7c-d2b674161c1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365869, 'reachable_time': 39539, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217767, 'error': None, 'target': 'ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.680 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1d7ca0ab-f499-4866-82d7-e753ea2e04cb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:10:05 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:05.680 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[24a98d35-56c1-4a9f-9e04-cbd8ca57424d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d1d7ca0ab\x2df499\x2d4866\x2d82d7\x2de753ea2e04cb.mount: Deactivated successfully.
Nov 22 10:10:05 compute-0 podman[217768]: 2025-11-22 10:10:05.799289745 +0000 UTC m=+0.079028028 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 10:10:05 compute-0 podman[217769]: 2025-11-22 10:10:05.825400429 +0000 UTC m=+0.105057330 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 10:10:06 compute-0 nova_compute[186981]: 2025-11-22 10:10:06.179 186985 DEBUG nova.compute.manager [req-2b577707-e770-4b75-bbeb-bb60fd296f0c req-e0b479b2-60ea-474c-9f09-3c320e374bf3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Received event network-vif-unplugged-0c051980-3a8d-48bb-9bf2-70309e50f76f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:10:06 compute-0 nova_compute[186981]: 2025-11-22 10:10:06.181 186985 DEBUG oslo_concurrency.lockutils [req-2b577707-e770-4b75-bbeb-bb60fd296f0c req-e0b479b2-60ea-474c-9f09-3c320e374bf3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:06 compute-0 nova_compute[186981]: 2025-11-22 10:10:06.182 186985 DEBUG oslo_concurrency.lockutils [req-2b577707-e770-4b75-bbeb-bb60fd296f0c req-e0b479b2-60ea-474c-9f09-3c320e374bf3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:06 compute-0 nova_compute[186981]: 2025-11-22 10:10:06.182 186985 DEBUG oslo_concurrency.lockutils [req-2b577707-e770-4b75-bbeb-bb60fd296f0c req-e0b479b2-60ea-474c-9f09-3c320e374bf3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:06 compute-0 nova_compute[186981]: 2025-11-22 10:10:06.182 186985 DEBUG nova.compute.manager [req-2b577707-e770-4b75-bbeb-bb60fd296f0c req-e0b479b2-60ea-474c-9f09-3c320e374bf3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] No waiting events found dispatching network-vif-unplugged-0c051980-3a8d-48bb-9bf2-70309e50f76f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:10:06 compute-0 nova_compute[186981]: 2025-11-22 10:10:06.183 186985 DEBUG nova.compute.manager [req-2b577707-e770-4b75-bbeb-bb60fd296f0c req-e0b479b2-60ea-474c-9f09-3c320e374bf3 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Received event network-vif-unplugged-0c051980-3a8d-48bb-9bf2-70309e50f76f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 10:10:07 compute-0 nova_compute[186981]: 2025-11-22 10:10:07.275 186985 DEBUG nova.network.neutron [-] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:10:07 compute-0 nova_compute[186981]: 2025-11-22 10:10:07.294 186985 INFO nova.compute.manager [-] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Took 1.72 seconds to deallocate network for instance.
Nov 22 10:10:07 compute-0 nova_compute[186981]: 2025-11-22 10:10:07.333 186985 DEBUG oslo_concurrency.lockutils [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:07 compute-0 nova_compute[186981]: 2025-11-22 10:10:07.334 186985 DEBUG oslo_concurrency.lockutils [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:07 compute-0 nova_compute[186981]: 2025-11-22 10:10:07.395 186985 DEBUG nova.compute.provider_tree [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:10:07 compute-0 nova_compute[186981]: 2025-11-22 10:10:07.411 186985 DEBUG nova.scheduler.client.report [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:10:07 compute-0 nova_compute[186981]: 2025-11-22 10:10:07.433 186985 DEBUG oslo_concurrency.lockutils [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:07 compute-0 nova_compute[186981]: 2025-11-22 10:10:07.466 186985 INFO nova.scheduler.client.report [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183
Nov 22 10:10:07 compute-0 nova_compute[186981]: 2025-11-22 10:10:07.521 186985 DEBUG oslo_concurrency.lockutils [None req-68258af7-f7d0-42bc-b0ec-ef072d6e296c fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:08 compute-0 nova_compute[186981]: 2025-11-22 10:10:08.261 186985 DEBUG nova.compute.manager [req-77e5549f-0b03-4021-b34a-ea166195dc59 req-39a4d94b-4aba-4a57-a0bf-2ce4083833ac 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Received event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:10:08 compute-0 nova_compute[186981]: 2025-11-22 10:10:08.261 186985 DEBUG oslo_concurrency.lockutils [req-77e5549f-0b03-4021-b34a-ea166195dc59 req-39a4d94b-4aba-4a57-a0bf-2ce4083833ac 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:08 compute-0 nova_compute[186981]: 2025-11-22 10:10:08.262 186985 DEBUG oslo_concurrency.lockutils [req-77e5549f-0b03-4021-b34a-ea166195dc59 req-39a4d94b-4aba-4a57-a0bf-2ce4083833ac 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:08 compute-0 nova_compute[186981]: 2025-11-22 10:10:08.263 186985 DEBUG oslo_concurrency.lockutils [req-77e5549f-0b03-4021-b34a-ea166195dc59 req-39a4d94b-4aba-4a57-a0bf-2ce4083833ac 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "230dfa9a-6ddb-49a0-90dd-c4b6b4c47183-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:08 compute-0 nova_compute[186981]: 2025-11-22 10:10:08.263 186985 DEBUG nova.compute.manager [req-77e5549f-0b03-4021-b34a-ea166195dc59 req-39a4d94b-4aba-4a57-a0bf-2ce4083833ac 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] No waiting events found dispatching network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:10:08 compute-0 nova_compute[186981]: 2025-11-22 10:10:08.264 186985 WARNING nova.compute.manager [req-77e5549f-0b03-4021-b34a-ea166195dc59 req-39a4d94b-4aba-4a57-a0bf-2ce4083833ac 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Received unexpected event network-vif-plugged-0c051980-3a8d-48bb-9bf2-70309e50f76f for instance with vm_state deleted and task_state None.
Nov 22 10:10:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:09.390 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:10:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:09.392 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:10:09 compute-0 nova_compute[186981]: 2025-11-22 10:10:09.394 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:10 compute-0 nova_compute[186981]: 2025-11-22 10:10:10.532 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:10 compute-0 nova_compute[186981]: 2025-11-22 10:10:10.675 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:12 compute-0 podman[217811]: 2025-11-22 10:10:12.627715812 +0000 UTC m=+0.077260121 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 10:10:12 compute-0 podman[217812]: 2025-11-22 10:10:12.671060885 +0000 UTC m=+0.115949617 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 10:10:15 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:15.395 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:15 compute-0 nova_compute[186981]: 2025-11-22 10:10:15.534 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:15 compute-0 nova_compute[186981]: 2025-11-22 10:10:15.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:10:15 compute-0 nova_compute[186981]: 2025-11-22 10:10:15.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:10:15 compute-0 nova_compute[186981]: 2025-11-22 10:10:15.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:10:15 compute-0 nova_compute[186981]: 2025-11-22 10:10:15.619 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:10:15 compute-0 nova_compute[186981]: 2025-11-22 10:10:15.682 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:10:16 compute-0 podman[217853]: 2025-11-22 10:10:16.62193157 +0000 UTC m=+0.068441201 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.622 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.623 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.623 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.623 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:10:16 compute-0 podman[217852]: 2025-11-22 10:10:16.625937229 +0000 UTC m=+0.075126583 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.780 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.781 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5746MB free_disk=73.45872497558594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.782 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.782 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.835 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.836 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:10:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.856 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.868 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.918 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:10:16 compute-0 nova_compute[186981]: 2025-11-22 10:10:16.918 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:17 compute-0 nova_compute[186981]: 2025-11-22 10:10:17.918 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:10:17 compute-0 nova_compute[186981]: 2025-11-22 10:10:17.919 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:10:17 compute-0 nova_compute[186981]: 2025-11-22 10:10:17.919 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:10:17 compute-0 nova_compute[186981]: 2025-11-22 10:10:17.919 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:10:17 compute-0 nova_compute[186981]: 2025-11-22 10:10:17.919 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:10:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:17.938 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:17.939 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:17.939 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:18 compute-0 nova_compute[186981]: 2025-11-22 10:10:18.480 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:18 compute-0 nova_compute[186981]: 2025-11-22 10:10:18.545 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:19 compute-0 nova_compute[186981]: 2025-11-22 10:10:19.589 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:10:20 compute-0 nova_compute[186981]: 2025-11-22 10:10:20.511 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763806205.510032, 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:10:20 compute-0 nova_compute[186981]: 2025-11-22 10:10:20.512 186985 INFO nova.compute.manager [-] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] VM Stopped (Lifecycle Event)
Nov 22 10:10:20 compute-0 nova_compute[186981]: 2025-11-22 10:10:20.533 186985 DEBUG nova.compute.manager [None req-86bed7d3-701e-45e8-9c5d-e38aeac7005c - - - - - -] [instance: 230dfa9a-6ddb-49a0-90dd-c4b6b4c47183] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:10:20 compute-0 nova_compute[186981]: 2025-11-22 10:10:20.584 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:20 compute-0 nova_compute[186981]: 2025-11-22 10:10:20.683 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:23 compute-0 nova_compute[186981]: 2025-11-22 10:10:23.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:10:25 compute-0 nova_compute[186981]: 2025-11-22 10:10:25.586 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:25 compute-0 nova_compute[186981]: 2025-11-22 10:10:25.686 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:27 compute-0 podman[217898]: 2025-11-22 10:10:27.600414521 +0000 UTC m=+0.056932581 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 10:10:30 compute-0 nova_compute[186981]: 2025-11-22 10:10:30.589 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:30 compute-0 nova_compute[186981]: 2025-11-22 10:10:30.742 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:35 compute-0 nova_compute[186981]: 2025-11-22 10:10:35.591 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:35 compute-0 nova_compute[186981]: 2025-11-22 10:10:35.744 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:36 compute-0 podman[217922]: 2025-11-22 10:10:36.609126943 +0000 UTC m=+0.058739301 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:10:36 compute-0 podman[217923]: 2025-11-22 10:10:36.638263127 +0000 UTC m=+0.087414252 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 22 10:10:39 compute-0 nova_compute[186981]: 2025-11-22 10:10:39.204 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:39 compute-0 nova_compute[186981]: 2025-11-22 10:10:39.205 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:39 compute-0 nova_compute[186981]: 2025-11-22 10:10:39.340 186985 DEBUG nova.compute.manager [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:10:39 compute-0 nova_compute[186981]: 2025-11-22 10:10:39.493 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:39 compute-0 nova_compute[186981]: 2025-11-22 10:10:39.494 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:39 compute-0 nova_compute[186981]: 2025-11-22 10:10:39.500 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:10:39 compute-0 nova_compute[186981]: 2025-11-22 10:10:39.501 186985 INFO nova.compute.claims [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:10:39 compute-0 nova_compute[186981]: 2025-11-22 10:10:39.786 186985 DEBUG nova.compute.provider_tree [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:10:39 compute-0 nova_compute[186981]: 2025-11-22 10:10:39.858 186985 DEBUG nova.scheduler.client.report [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.046 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.047 186985 DEBUG nova.compute.manager [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.244 186985 DEBUG nova.compute.manager [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.245 186985 DEBUG nova.network.neutron [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.446 186985 INFO nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.477 186985 DEBUG nova.compute.manager [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.593 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.733 186985 DEBUG nova.compute.manager [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.736 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.737 186985 INFO nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Creating image(s)
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.738 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.738 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.740 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.763 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.767 186985 DEBUG nova.policy [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.772 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.826 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.828 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.829 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.853 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.919 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.921 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.955 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.956 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:40 compute-0 nova_compute[186981]: 2025-11-22 10:10:40.956 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:10:41 compute-0 nova_compute[186981]: 2025-11-22 10:10:41.009 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:10:41 compute-0 nova_compute[186981]: 2025-11-22 10:10:41.011 186985 DEBUG nova.virt.disk.api [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:10:41 compute-0 nova_compute[186981]: 2025-11-22 10:10:41.011 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:10:41 compute-0 nova_compute[186981]: 2025-11-22 10:10:41.064 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:10:41 compute-0 nova_compute[186981]: 2025-11-22 10:10:41.065 186985 DEBUG nova.virt.disk.api [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:10:41 compute-0 nova_compute[186981]: 2025-11-22 10:10:41.065 186985 DEBUG nova.objects.instance [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 54acdc4c-5722-41ef-992f-2ac15ae8fdf9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:10:41 compute-0 nova_compute[186981]: 2025-11-22 10:10:41.084 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:10:41 compute-0 nova_compute[186981]: 2025-11-22 10:10:41.084 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Ensure instance console log exists: /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:10:41 compute-0 nova_compute[186981]: 2025-11-22 10:10:41.085 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:41 compute-0 nova_compute[186981]: 2025-11-22 10:10:41.085 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:41 compute-0 nova_compute[186981]: 2025-11-22 10:10:41.086 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:42 compute-0 nova_compute[186981]: 2025-11-22 10:10:42.504 186985 DEBUG nova.network.neutron [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Successfully created port: f6334999-5b68-4a4f-b5e6-f5660d06217d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:10:43 compute-0 nova_compute[186981]: 2025-11-22 10:10:43.345 186985 DEBUG nova.network.neutron [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Successfully updated port: f6334999-5b68-4a4f-b5e6-f5660d06217d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:10:43 compute-0 nova_compute[186981]: 2025-11-22 10:10:43.464 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:10:43 compute-0 nova_compute[186981]: 2025-11-22 10:10:43.464 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:10:43 compute-0 nova_compute[186981]: 2025-11-22 10:10:43.464 186985 DEBUG nova.network.neutron [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:10:43 compute-0 nova_compute[186981]: 2025-11-22 10:10:43.472 186985 DEBUG nova.compute.manager [req-e40298e6-e04a-4418-b265-c0e79ba1e512 req-df0014c6-dff9-4b6c-bc86-0ddb450e2629 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Received event network-changed-f6334999-5b68-4a4f-b5e6-f5660d06217d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:10:43 compute-0 nova_compute[186981]: 2025-11-22 10:10:43.472 186985 DEBUG nova.compute.manager [req-e40298e6-e04a-4418-b265-c0e79ba1e512 req-df0014c6-dff9-4b6c-bc86-0ddb450e2629 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Refreshing instance network info cache due to event network-changed-f6334999-5b68-4a4f-b5e6-f5660d06217d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:10:43 compute-0 nova_compute[186981]: 2025-11-22 10:10:43.473 186985 DEBUG oslo_concurrency.lockutils [req-e40298e6-e04a-4418-b265-c0e79ba1e512 req-df0014c6-dff9-4b6c-bc86-0ddb450e2629 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:10:43 compute-0 podman[217983]: 2025-11-22 10:10:43.6104865 +0000 UTC m=+0.059000648 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 10:10:43 compute-0 nova_compute[186981]: 2025-11-22 10:10:43.618 186985 DEBUG nova.network.neutron [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:10:43 compute-0 podman[217982]: 2025-11-22 10:10:43.631366519 +0000 UTC m=+0.085241022 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.374 186985 DEBUG nova.network.neutron [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Updating instance_info_cache with network_info: [{"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.699 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.699 186985 DEBUG nova.compute.manager [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Instance network_info: |[{"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.700 186985 DEBUG oslo_concurrency.lockutils [req-e40298e6-e04a-4418-b265-c0e79ba1e512 req-df0014c6-dff9-4b6c-bc86-0ddb450e2629 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.700 186985 DEBUG nova.network.neutron [req-e40298e6-e04a-4418-b265-c0e79ba1e512 req-df0014c6-dff9-4b6c-bc86-0ddb450e2629 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Refreshing network info cache for port f6334999-5b68-4a4f-b5e6-f5660d06217d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.704 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Start _get_guest_xml network_info=[{"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.710 186985 WARNING nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.714 186985 DEBUG nova.virt.libvirt.host [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.715 186985 DEBUG nova.virt.libvirt.host [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.718 186985 DEBUG nova.virt.libvirt.host [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.719 186985 DEBUG nova.virt.libvirt.host [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.719 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.719 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.720 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.720 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.720 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.720 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.721 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.721 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.721 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.721 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.721 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.722 186985 DEBUG nova.virt.hardware [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.725 186985 DEBUG nova.virt.libvirt.vif [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:10:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1421511268',display_name='tempest-TestNetworkBasicOps-server-1421511268',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1421511268',id=10,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA9T2G06T2BCXLs/9ZpLN3mYr/2guxx0KOElmDRx5afxifpsu20rAdPd4EllbcoN+zO0h5CQJ9qtDrfgra/I8Ic4GCqHGvLiFSpQZKVhq1MDN8kQYsyCqgHTZo/IwHqQsw==',key_name='tempest-TestNetworkBasicOps-1455852201',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-nmrof0gi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:10:40Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=54acdc4c-5722-41ef-992f-2ac15ae8fdf9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.725 186985 DEBUG nova.network.os_vif_util [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.726 186985 DEBUG nova.network.os_vif_util [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:f6:df,bridge_name='br-int',has_traffic_filtering=True,id=f6334999-5b68-4a4f-b5e6-f5660d06217d,network=Network(db7a0571-c9ff-4ba8-85be-9f66260c9300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6334999-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.727 186985 DEBUG nova.objects.instance [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 54acdc4c-5722-41ef-992f-2ac15ae8fdf9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.939 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:10:44 compute-0 nova_compute[186981]:   <uuid>54acdc4c-5722-41ef-992f-2ac15ae8fdf9</uuid>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   <name>instance-0000000a</name>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-1421511268</nova:name>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:10:44</nova:creationTime>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:10:44 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:10:44 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:10:44 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:10:44 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:10:44 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:10:44 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:10:44 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:10:44 compute-0 nova_compute[186981]:         <nova:port uuid="f6334999-5b68-4a4f-b5e6-f5660d06217d">
Nov 22 10:10:44 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <system>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <entry name="serial">54acdc4c-5722-41ef-992f-2ac15ae8fdf9</entry>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <entry name="uuid">54acdc4c-5722-41ef-992f-2ac15ae8fdf9</entry>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     </system>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   <os>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   </os>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   <features>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   </features>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk.config"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:aa:f6:df"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <target dev="tapf6334999-5b"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/console.log" append="off"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <video>
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     </video>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:10:44 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:10:44 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:10:44 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:10:44 compute-0 nova_compute[186981]: </domain>
Nov 22 10:10:44 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.941 186985 DEBUG nova.compute.manager [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Preparing to wait for external event network-vif-plugged-f6334999-5b68-4a4f-b5e6-f5660d06217d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.941 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.941 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.941 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.942 186985 DEBUG nova.virt.libvirt.vif [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:10:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1421511268',display_name='tempest-TestNetworkBasicOps-server-1421511268',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1421511268',id=10,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA9T2G06T2BCXLs/9ZpLN3mYr/2guxx0KOElmDRx5afxifpsu20rAdPd4EllbcoN+zO0h5CQJ9qtDrfgra/I8Ic4GCqHGvLiFSpQZKVhq1MDN8kQYsyCqgHTZo/IwHqQsw==',key_name='tempest-TestNetworkBasicOps-1455852201',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-nmrof0gi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:10:40Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=54acdc4c-5722-41ef-992f-2ac15ae8fdf9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.942 186985 DEBUG nova.network.os_vif_util [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.943 186985 DEBUG nova.network.os_vif_util [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:f6:df,bridge_name='br-int',has_traffic_filtering=True,id=f6334999-5b68-4a4f-b5e6-f5660d06217d,network=Network(db7a0571-c9ff-4ba8-85be-9f66260c9300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6334999-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.943 186985 DEBUG os_vif [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:f6:df,bridge_name='br-int',has_traffic_filtering=True,id=f6334999-5b68-4a4f-b5e6-f5660d06217d,network=Network(db7a0571-c9ff-4ba8-85be-9f66260c9300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6334999-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.943 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.944 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.944 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.947 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.948 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6334999-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.948 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf6334999-5b, col_values=(('external_ids', {'iface-id': 'f6334999-5b68-4a4f-b5e6-f5660d06217d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:aa:f6:df', 'vm-uuid': '54acdc4c-5722-41ef-992f-2ac15ae8fdf9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.949 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.950 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:44 compute-0 NetworkManager[55425]: <info>  [1763806244.9518] manager: (tapf6334999-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.952 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.962 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:44 compute-0 nova_compute[186981]: 2025-11-22 10:10:44.962 186985 INFO os_vif [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:f6:df,bridge_name='br-int',has_traffic_filtering=True,id=f6334999-5b68-4a4f-b5e6-f5660d06217d,network=Network(db7a0571-c9ff-4ba8-85be-9f66260c9300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6334999-5b')
Nov 22 10:10:45 compute-0 nova_compute[186981]: 2025-11-22 10:10:45.139 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:10:45 compute-0 nova_compute[186981]: 2025-11-22 10:10:45.140 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:10:45 compute-0 nova_compute[186981]: 2025-11-22 10:10:45.140 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:aa:f6:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:10:45 compute-0 nova_compute[186981]: 2025-11-22 10:10:45.141 186985 INFO nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Using config drive
Nov 22 10:10:45 compute-0 nova_compute[186981]: 2025-11-22 10:10:45.748 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.161 186985 INFO nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Creating config drive at /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk.config
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.169 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp5fst2cl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.296 186985 DEBUG oslo_concurrency.processutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp5fst2cl" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:10:46 compute-0 kernel: tapf6334999-5b: entered promiscuous mode
Nov 22 10:10:46 compute-0 NetworkManager[55425]: <info>  [1763806246.3556] manager: (tapf6334999-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Nov 22 10:10:46 compute-0 ovn_controller[95329]: 2025-11-22T10:10:46Z|00134|binding|INFO|Claiming lport f6334999-5b68-4a4f-b5e6-f5660d06217d for this chassis.
Nov 22 10:10:46 compute-0 ovn_controller[95329]: 2025-11-22T10:10:46Z|00135|binding|INFO|f6334999-5b68-4a4f-b5e6-f5660d06217d: Claiming fa:16:3e:aa:f6:df 10.100.0.11
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.354 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:46 compute-0 systemd-machined[153303]: New machine qemu-10-instance-0000000a.
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.408 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.411 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.413 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:f6:df 10.100.0.11'], port_security=['fa:16:3e:aa:f6:df 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '54acdc4c-5722-41ef-992f-2ac15ae8fdf9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db7a0571-c9ff-4ba8-85be-9f66260c9300', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '549b16e8-b02c-4f0b-8d1f-6217427613af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c4ac0fd-b593-43d9-ad20-afeaf5a5781e, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=f6334999-5b68-4a4f-b5e6-f5660d06217d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:10:46 compute-0 ovn_controller[95329]: 2025-11-22T10:10:46Z|00136|binding|INFO|Setting lport f6334999-5b68-4a4f-b5e6-f5660d06217d ovn-installed in OVS
Nov 22 10:10:46 compute-0 ovn_controller[95329]: 2025-11-22T10:10:46Z|00137|binding|INFO|Setting lport f6334999-5b68-4a4f-b5e6-f5660d06217d up in Southbound
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.415 104216 INFO neutron.agent.ovn.metadata.agent [-] Port f6334999-5b68-4a4f-b5e6-f5660d06217d in datapath db7a0571-c9ff-4ba8-85be-9f66260c9300 bound to our chassis
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.416 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network db7a0571-c9ff-4ba8-85be-9f66260c9300
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.415 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:46 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.432 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9422d9-79fd-483c-b397-030c6a886ec5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.433 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdb7a0571-c1 in ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:10:46 compute-0 systemd-udevd[218039]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.436 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdb7a0571-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.436 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[693bfa41-39d0-4910-9bc8-2bafd0792d72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.437 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[8d6bb0c1-fd0b-4f60-b318-c2cb5d6f7126]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 NetworkManager[55425]: <info>  [1763806246.4496] device (tapf6334999-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.449 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[30ec5dda-738a-4938-9e92-2d21f2cb313c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 NetworkManager[55425]: <info>  [1763806246.4510] device (tapf6334999-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.466 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[db9204f6-c7a6-4aaf-a7c6-d2bdfe1ee684]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.498 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[84d9f078-d046-46b6-bd41-891520a05414]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.503 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[94f909d4-5475-447f-af7e-22284f428db9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 NetworkManager[55425]: <info>  [1763806246.5049] manager: (tapdb7a0571-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Nov 22 10:10:46 compute-0 systemd-udevd[218046]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.534 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[8a7b2edf-27a4-4de4-b02c-3ad415f36d47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.538 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[4f2cd612-f01e-4dc3-a03c-1a75e90547cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 NetworkManager[55425]: <info>  [1763806246.5611] device (tapdb7a0571-c0): carrier: link connected
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.566 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[cd7f7c82-66a2-4239-afe3-bc9d3af089c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.586 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[2137c18a-0b53-474d-b4cc-5af23b633045]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb7a0571-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:67:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 370426, 'reachable_time': 23733, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218071, 'error': None, 'target': 'ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.602 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[f69b2ae1-1ff3-4e34-83ef-381901807efc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe05:677c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 370426, 'tstamp': 370426}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218073, 'error': None, 'target': 'ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.622 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[b8416845-5c4f-400e-9431-47df4d9df618]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb7a0571-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:67:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 370426, 'reachable_time': 23733, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218078, 'error': None, 'target': 'ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.652 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7f9aeb01-fca0-4648-bb8a-64e1cd0f8960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.707 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806246.707182, 54acdc4c-5722-41ef-992f-2ac15ae8fdf9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.708 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] VM Started (Lifecycle Event)
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.714 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7f34e140-a214-4eb4-a8df-cf956f7a3c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.715 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb7a0571-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.716 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.716 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb7a0571-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:46 compute-0 NetworkManager[55425]: <info>  [1763806246.7638] manager: (tapdb7a0571-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Nov 22 10:10:46 compute-0 kernel: tapdb7a0571-c0: entered promiscuous mode
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.764 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.767 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdb7a0571-c0, col_values=(('external_ids', {'iface-id': '3a810d19-ffdb-4276-94db-cf989ba2ac19'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.766 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.768 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:46 compute-0 ovn_controller[95329]: 2025-11-22T10:10:46Z|00138|binding|INFO|Releasing lport 3a810d19-ffdb-4276-94db-cf989ba2ac19 from this chassis (sb_readonly=0)
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.779 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.780 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.780 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/db7a0571-c9ff-4ba8-85be-9f66260c9300.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/db7a0571-c9ff-4ba8-85be-9f66260c9300.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.781 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[0809902a-f4b3-4cf7-bf9a-2fbd8f63bbaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.782 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-db7a0571-c9ff-4ba8-85be-9f66260c9300
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/db7a0571-c9ff-4ba8-85be-9f66260c9300.pid.haproxy
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID db7a0571-c9ff-4ba8-85be-9f66260c9300
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:10:46 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:10:46.783 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300', 'env', 'PROCESS_TAG=haproxy-db7a0571-c9ff-4ba8-85be-9f66260c9300', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/db7a0571-c9ff-4ba8-85be-9f66260c9300.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.806 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.811 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806246.7072985, 54acdc4c-5722-41ef-992f-2ac15ae8fdf9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:10:46 compute-0 nova_compute[186981]: 2025-11-22 10:10:46.811 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] VM Paused (Lifecycle Event)
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.147 186985 DEBUG nova.compute.manager [req-66bca67a-74ff-4d2a-8229-ca4d809ae07d req-6e7c7056-e0f3-477e-9199-702666c36c66 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Received event network-vif-plugged-f6334999-5b68-4a4f-b5e6-f5660d06217d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.148 186985 DEBUG oslo_concurrency.lockutils [req-66bca67a-74ff-4d2a-8229-ca4d809ae07d req-6e7c7056-e0f3-477e-9199-702666c36c66 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.148 186985 DEBUG oslo_concurrency.lockutils [req-66bca67a-74ff-4d2a-8229-ca4d809ae07d req-6e7c7056-e0f3-477e-9199-702666c36c66 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.148 186985 DEBUG oslo_concurrency.lockutils [req-66bca67a-74ff-4d2a-8229-ca4d809ae07d req-6e7c7056-e0f3-477e-9199-702666c36c66 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.149 186985 DEBUG nova.compute.manager [req-66bca67a-74ff-4d2a-8229-ca4d809ae07d req-6e7c7056-e0f3-477e-9199-702666c36c66 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Processing event network-vif-plugged-f6334999-5b68-4a4f-b5e6-f5660d06217d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.150 186985 DEBUG nova.compute.manager [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.157 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.160 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.163 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806247.15329, 54acdc4c-5722-41ef-992f-2ac15ae8fdf9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.163 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] VM Resumed (Lifecycle Event)
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.165 186985 INFO nova.virt.libvirt.driver [-] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Instance spawned successfully.
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.166 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:10:47 compute-0 podman[218112]: 2025-11-22 10:10:47.182675666 +0000 UTC m=+0.058702519 container create f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.190 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.194 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.204 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.205 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.205 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.206 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.207 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.208 186985 DEBUG nova.virt.libvirt.driver [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:10:47 compute-0 systemd[1]: Started libpod-conmon-f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098.scope.
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.223 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:10:47 compute-0 podman[218112]: 2025-11-22 10:10:47.15126604 +0000 UTC m=+0.027292893 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:10:47 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:10:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/776e7d03cd70fb2407432a95354ffee0e90c61e3dd979ff25cecbc3c860d7ff8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:10:47 compute-0 podman[218112]: 2025-11-22 10:10:47.268541554 +0000 UTC m=+0.144568417 container init f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:10:47 compute-0 podman[218112]: 2025-11-22 10:10:47.27610129 +0000 UTC m=+0.152128163 container start f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:10:47 compute-0 neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300[218129]: [NOTICE]   (218155) : New worker (218175) forked
Nov 22 10:10:47 compute-0 neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300[218129]: [NOTICE]   (218155) : Loading success.
Nov 22 10:10:47 compute-0 podman[218128]: 2025-11-22 10:10:47.322311209 +0000 UTC m=+0.094379111 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 10:10:47 compute-0 podman[218125]: 2025-11-22 10:10:47.335554279 +0000 UTC m=+0.105871234 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.370 186985 INFO nova.compute.manager [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Took 6.64 seconds to spawn the instance on the hypervisor.
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.370 186985 DEBUG nova.compute.manager [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.568 186985 INFO nova.compute.manager [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Took 8.10 seconds to build instance.
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.689 186985 DEBUG oslo_concurrency.lockutils [None req-acb73814-47f2-4d06-959d-a17bd021cf50 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.819 186985 DEBUG nova.network.neutron [req-e40298e6-e04a-4418-b265-c0e79ba1e512 req-df0014c6-dff9-4b6c-bc86-0ddb450e2629 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Updated VIF entry in instance network info cache for port f6334999-5b68-4a4f-b5e6-f5660d06217d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.819 186985 DEBUG nova.network.neutron [req-e40298e6-e04a-4418-b265-c0e79ba1e512 req-df0014c6-dff9-4b6c-bc86-0ddb450e2629 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Updating instance_info_cache with network_info: [{"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:10:47 compute-0 nova_compute[186981]: 2025-11-22 10:10:47.886 186985 DEBUG oslo_concurrency.lockutils [req-e40298e6-e04a-4418-b265-c0e79ba1e512 req-df0014c6-dff9-4b6c-bc86-0ddb450e2629 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:10:49 compute-0 nova_compute[186981]: 2025-11-22 10:10:49.256 186985 DEBUG nova.compute.manager [req-abcd8b57-5e6c-4ad2-ae6b-80353c692720 req-6fee1da1-9a8d-432b-9b93-349d078b2f1a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Received event network-vif-plugged-f6334999-5b68-4a4f-b5e6-f5660d06217d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:10:49 compute-0 nova_compute[186981]: 2025-11-22 10:10:49.257 186985 DEBUG oslo_concurrency.lockutils [req-abcd8b57-5e6c-4ad2-ae6b-80353c692720 req-6fee1da1-9a8d-432b-9b93-349d078b2f1a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:10:49 compute-0 nova_compute[186981]: 2025-11-22 10:10:49.257 186985 DEBUG oslo_concurrency.lockutils [req-abcd8b57-5e6c-4ad2-ae6b-80353c692720 req-6fee1da1-9a8d-432b-9b93-349d078b2f1a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:10:49 compute-0 nova_compute[186981]: 2025-11-22 10:10:49.257 186985 DEBUG oslo_concurrency.lockutils [req-abcd8b57-5e6c-4ad2-ae6b-80353c692720 req-6fee1da1-9a8d-432b-9b93-349d078b2f1a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:10:49 compute-0 nova_compute[186981]: 2025-11-22 10:10:49.258 186985 DEBUG nova.compute.manager [req-abcd8b57-5e6c-4ad2-ae6b-80353c692720 req-6fee1da1-9a8d-432b-9b93-349d078b2f1a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] No waiting events found dispatching network-vif-plugged-f6334999-5b68-4a4f-b5e6-f5660d06217d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:10:49 compute-0 nova_compute[186981]: 2025-11-22 10:10:49.258 186985 WARNING nova.compute.manager [req-abcd8b57-5e6c-4ad2-ae6b-80353c692720 req-6fee1da1-9a8d-432b-9b93-349d078b2f1a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Received unexpected event network-vif-plugged-f6334999-5b68-4a4f-b5e6-f5660d06217d for instance with vm_state active and task_state None.
Nov 22 10:10:49 compute-0 nova_compute[186981]: 2025-11-22 10:10:49.951 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:50 compute-0 nova_compute[186981]: 2025-11-22 10:10:50.787 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:51 compute-0 nova_compute[186981]: 2025-11-22 10:10:51.088 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:51 compute-0 NetworkManager[55425]: <info>  [1763806251.0894] manager: (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Nov 22 10:10:51 compute-0 NetworkManager[55425]: <info>  [1763806251.0902] manager: (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 22 10:10:51 compute-0 ovn_controller[95329]: 2025-11-22T10:10:51Z|00139|binding|INFO|Releasing lport 3a810d19-ffdb-4276-94db-cf989ba2ac19 from this chassis (sb_readonly=0)
Nov 22 10:10:51 compute-0 nova_compute[186981]: 2025-11-22 10:10:51.116 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:51 compute-0 ovn_controller[95329]: 2025-11-22T10:10:51Z|00140|binding|INFO|Releasing lport 3a810d19-ffdb-4276-94db-cf989ba2ac19 from this chassis (sb_readonly=0)
Nov 22 10:10:51 compute-0 nova_compute[186981]: 2025-11-22 10:10:51.126 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:51 compute-0 nova_compute[186981]: 2025-11-22 10:10:51.426 186985 DEBUG nova.compute.manager [req-11a505f2-f76c-4f98-8c69-67ddd2833db7 req-df89c1fc-f1bd-4dd1-a392-55a4f07e8eb9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Received event network-changed-f6334999-5b68-4a4f-b5e6-f5660d06217d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:10:51 compute-0 nova_compute[186981]: 2025-11-22 10:10:51.427 186985 DEBUG nova.compute.manager [req-11a505f2-f76c-4f98-8c69-67ddd2833db7 req-df89c1fc-f1bd-4dd1-a392-55a4f07e8eb9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Refreshing instance network info cache due to event network-changed-f6334999-5b68-4a4f-b5e6-f5660d06217d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:10:51 compute-0 nova_compute[186981]: 2025-11-22 10:10:51.427 186985 DEBUG oslo_concurrency.lockutils [req-11a505f2-f76c-4f98-8c69-67ddd2833db7 req-df89c1fc-f1bd-4dd1-a392-55a4f07e8eb9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:10:51 compute-0 nova_compute[186981]: 2025-11-22 10:10:51.428 186985 DEBUG oslo_concurrency.lockutils [req-11a505f2-f76c-4f98-8c69-67ddd2833db7 req-df89c1fc-f1bd-4dd1-a392-55a4f07e8eb9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:10:51 compute-0 nova_compute[186981]: 2025-11-22 10:10:51.428 186985 DEBUG nova.network.neutron [req-11a505f2-f76c-4f98-8c69-67ddd2833db7 req-df89c1fc-f1bd-4dd1-a392-55a4f07e8eb9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Refreshing network info cache for port f6334999-5b68-4a4f-b5e6-f5660d06217d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:10:54 compute-0 nova_compute[186981]: 2025-11-22 10:10:54.953 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:55 compute-0 nova_compute[186981]: 2025-11-22 10:10:55.731 186985 DEBUG nova.network.neutron [req-11a505f2-f76c-4f98-8c69-67ddd2833db7 req-df89c1fc-f1bd-4dd1-a392-55a4f07e8eb9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Updated VIF entry in instance network info cache for port f6334999-5b68-4a4f-b5e6-f5660d06217d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:10:55 compute-0 nova_compute[186981]: 2025-11-22 10:10:55.732 186985 DEBUG nova.network.neutron [req-11a505f2-f76c-4f98-8c69-67ddd2833db7 req-df89c1fc-f1bd-4dd1-a392-55a4f07e8eb9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Updating instance_info_cache with network_info: [{"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:10:55 compute-0 nova_compute[186981]: 2025-11-22 10:10:55.762 186985 DEBUG oslo_concurrency.lockutils [req-11a505f2-f76c-4f98-8c69-67ddd2833db7 req-df89c1fc-f1bd-4dd1-a392-55a4f07e8eb9 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:10:55 compute-0 nova_compute[186981]: 2025-11-22 10:10:55.817 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:10:58 compute-0 ovn_controller[95329]: 2025-11-22T10:10:58Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:aa:f6:df 10.100.0.11
Nov 22 10:10:58 compute-0 ovn_controller[95329]: 2025-11-22T10:10:58Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:aa:f6:df 10.100.0.11
Nov 22 10:10:58 compute-0 podman[218200]: 2025-11-22 10:10:58.613212708 +0000 UTC m=+0.067428597 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 10:10:59 compute-0 nova_compute[186981]: 2025-11-22 10:10:59.956 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:00 compute-0 nova_compute[186981]: 2025-11-22 10:11:00.866 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:03 compute-0 nova_compute[186981]: 2025-11-22 10:11:03.920 186985 INFO nova.compute.manager [None req-85fda720-60dd-4b05-abf4-049e1fd59586 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Get console output
Nov 22 10:11:03 compute-0 nova_compute[186981]: 2025-11-22 10:11:03.929 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:11:04 compute-0 ovn_controller[95329]: 2025-11-22T10:11:04Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:aa:f6:df 10.100.0.11
Nov 22 10:11:04 compute-0 nova_compute[186981]: 2025-11-22 10:11:04.957 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:05 compute-0 nova_compute[186981]: 2025-11-22 10:11:05.931 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:07 compute-0 ovn_controller[95329]: 2025-11-22T10:11:07Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:aa:f6:df 10.100.0.11
Nov 22 10:11:07 compute-0 podman[218224]: 2025-11-22 10:11:07.617328355 +0000 UTC m=+0.071258451 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 10:11:07 compute-0 podman[218225]: 2025-11-22 10:11:07.65310058 +0000 UTC m=+0.102947585 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.745 186985 DEBUG nova.compute.manager [req-d949acc9-4c96-4834-96ad-383692ab93e5 req-6f82232a-4610-4b56-9808-da7882091784 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Received event network-changed-f6334999-5b68-4a4f-b5e6-f5660d06217d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.746 186985 DEBUG nova.compute.manager [req-d949acc9-4c96-4834-96ad-383692ab93e5 req-6f82232a-4610-4b56-9808-da7882091784 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Refreshing instance network info cache due to event network-changed-f6334999-5b68-4a4f-b5e6-f5660d06217d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.748 186985 DEBUG oslo_concurrency.lockutils [req-d949acc9-4c96-4834-96ad-383692ab93e5 req-6f82232a-4610-4b56-9808-da7882091784 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.748 186985 DEBUG oslo_concurrency.lockutils [req-d949acc9-4c96-4834-96ad-383692ab93e5 req-6f82232a-4610-4b56-9808-da7882091784 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.748 186985 DEBUG nova.network.neutron [req-d949acc9-4c96-4834-96ad-383692ab93e5 req-6f82232a-4610-4b56-9808-da7882091784 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Refreshing network info cache for port f6334999-5b68-4a4f-b5e6-f5660d06217d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.808 186985 DEBUG oslo_concurrency.lockutils [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.809 186985 DEBUG oslo_concurrency.lockutils [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.810 186985 DEBUG oslo_concurrency.lockutils [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:09.810 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.811 186985 DEBUG oslo_concurrency.lockutils [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.811 186985 DEBUG oslo_concurrency.lockutils [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:09.811 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.813 186985 INFO nova.compute.manager [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Terminating instance
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.815 186985 DEBUG nova.compute.manager [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.816 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:09 compute-0 kernel: tapf6334999-5b (unregistering): left promiscuous mode
Nov 22 10:11:09 compute-0 NetworkManager[55425]: <info>  [1763806269.8924] device (tapf6334999-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.906 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:09 compute-0 ovn_controller[95329]: 2025-11-22T10:11:09Z|00141|binding|INFO|Releasing lport f6334999-5b68-4a4f-b5e6-f5660d06217d from this chassis (sb_readonly=0)
Nov 22 10:11:09 compute-0 ovn_controller[95329]: 2025-11-22T10:11:09Z|00142|binding|INFO|Setting lport f6334999-5b68-4a4f-b5e6-f5660d06217d down in Southbound
Nov 22 10:11:09 compute-0 ovn_controller[95329]: 2025-11-22T10:11:09Z|00143|binding|INFO|Removing iface tapf6334999-5b ovn-installed in OVS
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.910 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:09.919 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:f6:df 10.100.0.11'], port_security=['fa:16:3e:aa:f6:df 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '54acdc4c-5722-41ef-992f-2ac15ae8fdf9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db7a0571-c9ff-4ba8-85be-9f66260c9300', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '549b16e8-b02c-4f0b-8d1f-6217427613af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c4ac0fd-b593-43d9-ad20-afeaf5a5781e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=f6334999-5b68-4a4f-b5e6-f5660d06217d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:11:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:09.922 104216 INFO neutron.agent.ovn.metadata.agent [-] Port f6334999-5b68-4a4f-b5e6-f5660d06217d in datapath db7a0571-c9ff-4ba8-85be-9f66260c9300 unbound from our chassis
Nov 22 10:11:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:09.924 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network db7a0571-c9ff-4ba8-85be-9f66260c9300, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:11:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:09.926 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[e0e03eaa-9cec-4856-b772-cf38ba441a87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:09 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:09.927 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300 namespace which is not needed anymore
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.946 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:09 compute-0 nova_compute[186981]: 2025-11-22 10:11:09.959 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:09 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 22 10:11:09 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 12.808s CPU time.
Nov 22 10:11:09 compute-0 systemd-machined[153303]: Machine qemu-10-instance-0000000a terminated.
Nov 22 10:11:10 compute-0 neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300[218129]: [NOTICE]   (218155) : haproxy version is 2.8.14-c23fe91
Nov 22 10:11:10 compute-0 neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300[218129]: [NOTICE]   (218155) : path to executable is /usr/sbin/haproxy
Nov 22 10:11:10 compute-0 neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300[218129]: [WARNING]  (218155) : Exiting Master process...
Nov 22 10:11:10 compute-0 neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300[218129]: [ALERT]    (218155) : Current worker (218175) exited with code 143 (Terminated)
Nov 22 10:11:10 compute-0 neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300[218129]: [WARNING]  (218155) : All workers exited. Exiting... (0)
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.095 186985 INFO nova.virt.libvirt.driver [-] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Instance destroyed successfully.
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.095 186985 DEBUG nova.objects.instance [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid 54acdc4c-5722-41ef-992f-2ac15ae8fdf9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:11:10 compute-0 systemd[1]: libpod-f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098.scope: Deactivated successfully.
Nov 22 10:11:10 compute-0 podman[218295]: 2025-11-22 10:11:10.104810714 +0000 UTC m=+0.058692940 container died f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.109 186985 DEBUG nova.virt.libvirt.vif [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:10:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1421511268',display_name='tempest-TestNetworkBasicOps-server-1421511268',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1421511268',id=10,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA9T2G06T2BCXLs/9ZpLN3mYr/2guxx0KOElmDRx5afxifpsu20rAdPd4EllbcoN+zO0h5CQJ9qtDrfgra/I8Ic4GCqHGvLiFSpQZKVhq1MDN8kQYsyCqgHTZo/IwHqQsw==',key_name='tempest-TestNetworkBasicOps-1455852201',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:10:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-nmrof0gi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:10:47Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=54acdc4c-5722-41ef-992f-2ac15ae8fdf9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.109 186985 DEBUG nova.network.os_vif_util [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.110 186985 DEBUG nova.network.os_vif_util [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:aa:f6:df,bridge_name='br-int',has_traffic_filtering=True,id=f6334999-5b68-4a4f-b5e6-f5660d06217d,network=Network(db7a0571-c9ff-4ba8-85be-9f66260c9300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6334999-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.110 186985 DEBUG os_vif [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:f6:df,bridge_name='br-int',has_traffic_filtering=True,id=f6334999-5b68-4a4f-b5e6-f5660d06217d,network=Network(db7a0571-c9ff-4ba8-85be-9f66260c9300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6334999-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.112 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.112 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6334999-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.114 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.116 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.118 186985 INFO os_vif [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:f6:df,bridge_name='br-int',has_traffic_filtering=True,id=f6334999-5b68-4a4f-b5e6-f5660d06217d,network=Network(db7a0571-c9ff-4ba8-85be-9f66260c9300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6334999-5b')
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.118 186985 INFO nova.virt.libvirt.driver [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Deleting instance files /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9_del
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.119 186985 INFO nova.virt.libvirt.driver [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Deletion of /var/lib/nova/instances/54acdc4c-5722-41ef-992f-2ac15ae8fdf9_del complete
Nov 22 10:11:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098-userdata-shm.mount: Deactivated successfully.
Nov 22 10:11:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-776e7d03cd70fb2407432a95354ffee0e90c61e3dd979ff25cecbc3c860d7ff8-merged.mount: Deactivated successfully.
Nov 22 10:11:10 compute-0 podman[218295]: 2025-11-22 10:11:10.149144101 +0000 UTC m=+0.103026297 container cleanup f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.175 186985 INFO nova.compute.manager [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.175 186985 DEBUG oslo.service.loopingcall [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.176 186985 DEBUG nova.compute.manager [-] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.176 186985 DEBUG nova.network.neutron [-] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:11:10 compute-0 systemd[1]: libpod-conmon-f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098.scope: Deactivated successfully.
Nov 22 10:11:10 compute-0 podman[218338]: 2025-11-22 10:11:10.20825147 +0000 UTC m=+0.041743887 container remove f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 10:11:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:10.213 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[dabbcfac-7a73-485c-a4c5-0fef16849bab]: (4, ('Sat Nov 22 10:11:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300 (f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098)\nf269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098\nSat Nov 22 10:11:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300 (f269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098)\nf269a8ed68bcfe0c4dcdd4545c4346009dfbb4152dbf4561341e2be2835cb098\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:10.215 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[e3020d46-aa69-493e-9ecc-3af773e17988]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:10.216 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb7a0571-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.217 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:10 compute-0 kernel: tapdb7a0571-c0: left promiscuous mode
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.228 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:10.231 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4f1ffc-55cc-43e1-a98c-47c427eb53b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:10.245 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[43fdc714-4310-4f98-a5a0-8bc9153ce8cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:10.247 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[24cf351c-3366-416e-87df-abcd98b9b703]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:10.263 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[8b33b604-e61d-45c4-987e-3cf798f749da]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 370420, 'reachable_time': 37714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218356, 'error': None, 'target': 'ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:10.266 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-db7a0571-c9ff-4ba8-85be-9f66260c9300 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:11:10 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:10.266 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[94b63a05-149e-4edd-be90-46bf081fa696]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:10 compute-0 systemd[1]: run-netns-ovnmeta\x2ddb7a0571\x2dc9ff\x2d4ba8\x2d85be\x2d9f66260c9300.mount: Deactivated successfully.
Nov 22 10:11:10 compute-0 nova_compute[186981]: 2025-11-22 10:11:10.933 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.285 186985 DEBUG nova.network.neutron [-] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.311 186985 INFO nova.compute.manager [-] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Took 1.13 seconds to deallocate network for instance.
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.380 186985 DEBUG oslo_concurrency.lockutils [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.381 186985 DEBUG oslo_concurrency.lockutils [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.458 186985 DEBUG nova.compute.provider_tree [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.472 186985 DEBUG nova.scheduler.client.report [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.490 186985 DEBUG oslo_concurrency.lockutils [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.521 186985 INFO nova.scheduler.client.report [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance 54acdc4c-5722-41ef-992f-2ac15ae8fdf9
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.578 186985 DEBUG nova.network.neutron [req-d949acc9-4c96-4834-96ad-383692ab93e5 req-6f82232a-4610-4b56-9808-da7882091784 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Updated VIF entry in instance network info cache for port f6334999-5b68-4a4f-b5e6-f5660d06217d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.579 186985 DEBUG nova.network.neutron [req-d949acc9-4c96-4834-96ad-383692ab93e5 req-6f82232a-4610-4b56-9808-da7882091784 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Updating instance_info_cache with network_info: [{"id": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "address": "fa:16:3e:aa:f6:df", "network": {"id": "db7a0571-c9ff-4ba8-85be-9f66260c9300", "bridge": "br-int", "label": "tempest-network-smoke--681881421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6334999-5b", "ovs_interfaceid": "f6334999-5b68-4a4f-b5e6-f5660d06217d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.585 186985 DEBUG oslo_concurrency.lockutils [None req-618b7a63-956e-410d-b03a-1d7bceb18353 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.597 186985 DEBUG oslo_concurrency.lockutils [req-d949acc9-4c96-4834-96ad-383692ab93e5 req-6f82232a-4610-4b56-9808-da7882091784 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-54acdc4c-5722-41ef-992f-2ac15ae8fdf9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.824 186985 DEBUG nova.compute.manager [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Received event network-vif-unplugged-f6334999-5b68-4a4f-b5e6-f5660d06217d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.825 186985 DEBUG oslo_concurrency.lockutils [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.825 186985 DEBUG oslo_concurrency.lockutils [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.825 186985 DEBUG oslo_concurrency.lockutils [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.826 186985 DEBUG nova.compute.manager [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] No waiting events found dispatching network-vif-unplugged-f6334999-5b68-4a4f-b5e6-f5660d06217d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.826 186985 WARNING nova.compute.manager [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Received unexpected event network-vif-unplugged-f6334999-5b68-4a4f-b5e6-f5660d06217d for instance with vm_state deleted and task_state None.
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.826 186985 DEBUG nova.compute.manager [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Received event network-vif-plugged-f6334999-5b68-4a4f-b5e6-f5660d06217d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.827 186985 DEBUG oslo_concurrency.lockutils [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.827 186985 DEBUG oslo_concurrency.lockutils [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.827 186985 DEBUG oslo_concurrency.lockutils [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "54acdc4c-5722-41ef-992f-2ac15ae8fdf9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.827 186985 DEBUG nova.compute.manager [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] No waiting events found dispatching network-vif-plugged-f6334999-5b68-4a4f-b5e6-f5660d06217d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.828 186985 WARNING nova.compute.manager [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Received unexpected event network-vif-plugged-f6334999-5b68-4a4f-b5e6-f5660d06217d for instance with vm_state deleted and task_state None.
Nov 22 10:11:11 compute-0 nova_compute[186981]: 2025-11-22 10:11:11.828 186985 DEBUG nova.compute.manager [req-66b64eea-1f34-4699-acb5-c95a885aa18c req-44a270b8-f5b0-49db-a07f-131970e3aef4 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Received event network-vif-deleted-f6334999-5b68-4a4f-b5e6-f5660d06217d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:11:14 compute-0 nova_compute[186981]: 2025-11-22 10:11:14.565 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:14 compute-0 podman[218358]: 2025-11-22 10:11:14.605792192 +0000 UTC m=+0.060065667 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 10:11:14 compute-0 podman[218359]: 2025-11-22 10:11:14.619962808 +0000 UTC m=+0.072048433 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 10:11:14 compute-0 nova_compute[186981]: 2025-11-22 10:11:14.633 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:15 compute-0 nova_compute[186981]: 2025-11-22 10:11:15.116 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:15 compute-0 nova_compute[186981]: 2025-11-22 10:11:15.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:11:15 compute-0 nova_compute[186981]: 2025-11-22 10:11:15.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:11:15 compute-0 nova_compute[186981]: 2025-11-22 10:11:15.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:11:15 compute-0 nova_compute[186981]: 2025-11-22 10:11:15.610 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:11:15 compute-0 nova_compute[186981]: 2025-11-22 10:11:15.969 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:16 compute-0 nova_compute[186981]: 2025-11-22 10:11:16.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:11:16 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:16.814 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:17 compute-0 nova_compute[186981]: 2025-11-22 10:11:17.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:11:17 compute-0 nova_compute[186981]: 2025-11-22 10:11:17.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:11:17 compute-0 podman[218397]: 2025-11-22 10:11:17.633487721 +0000 UTC m=+0.079537008 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 10:11:17 compute-0 podman[218398]: 2025-11-22 10:11:17.633546592 +0000 UTC m=+0.077629415 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 10:11:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:17.940 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:17.940 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:17.941 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.619 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.619 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.619 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.619 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.768 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.769 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5746MB free_disk=73.45878982543945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.769 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.769 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.823 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.823 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.841 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.853 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.871 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:11:18 compute-0 nova_compute[186981]: 2025-11-22 10:11:18.871 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:19 compute-0 nova_compute[186981]: 2025-11-22 10:11:19.865 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:11:19 compute-0 nova_compute[186981]: 2025-11-22 10:11:19.885 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:11:19 compute-0 nova_compute[186981]: 2025-11-22 10:11:19.885 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:11:20 compute-0 nova_compute[186981]: 2025-11-22 10:11:20.118 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:20 compute-0 nova_compute[186981]: 2025-11-22 10:11:20.609 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:11:20 compute-0 nova_compute[186981]: 2025-11-22 10:11:20.973 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:23 compute-0 nova_compute[186981]: 2025-11-22 10:11:23.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:11:25 compute-0 nova_compute[186981]: 2025-11-22 10:11:25.093 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763806270.0921152, 54acdc4c-5722-41ef-992f-2ac15ae8fdf9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:11:25 compute-0 nova_compute[186981]: 2025-11-22 10:11:25.093 186985 INFO nova.compute.manager [-] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] VM Stopped (Lifecycle Event)
Nov 22 10:11:25 compute-0 nova_compute[186981]: 2025-11-22 10:11:25.120 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:25 compute-0 nova_compute[186981]: 2025-11-22 10:11:25.142 186985 DEBUG nova.compute.manager [None req-7a238fee-9c40-4519-9b61-315734684ed2 - - - - - -] [instance: 54acdc4c-5722-41ef-992f-2ac15ae8fdf9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:11:25 compute-0 nova_compute[186981]: 2025-11-22 10:11:25.975 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:29 compute-0 podman[218442]: 2025-11-22 10:11:29.593445198 +0000 UTC m=+0.050422654 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 10:11:30 compute-0 nova_compute[186981]: 2025-11-22 10:11:30.121 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:31 compute-0 nova_compute[186981]: 2025-11-22 10:11:31.013 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.434 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.434 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.454 186985 DEBUG nova.compute.manager [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.529 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.529 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.536 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.536 186985 INFO nova.compute.claims [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.620 186985 DEBUG nova.compute.provider_tree [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.632 186985 DEBUG nova.scheduler.client.report [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.650 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.651 186985 DEBUG nova.compute.manager [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.703 186985 DEBUG nova.compute.manager [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.704 186985 DEBUG nova.network.neutron [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.722 186985 INFO nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.741 186985 DEBUG nova.compute.manager [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.844 186985 DEBUG nova.compute.manager [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.846 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.847 186985 INFO nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Creating image(s)
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.848 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.849 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.850 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.874 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.962 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.963 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.964 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:32 compute-0 nova_compute[186981]: 2025-11-22 10:11:32.978 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.050 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.051 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.084 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.085 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.086 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.140 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.142 186985 DEBUG nova.virt.disk.api [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.142 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.219 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.220 186985 DEBUG nova.virt.disk.api [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.220 186985 DEBUG nova.objects.instance [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.242 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.243 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Ensure instance console log exists: /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.243 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.243 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.244 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:33 compute-0 nova_compute[186981]: 2025-11-22 10:11:33.779 186985 DEBUG nova.policy [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:11:35 compute-0 nova_compute[186981]: 2025-11-22 10:11:35.123 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:36 compute-0 nova_compute[186981]: 2025-11-22 10:11:36.017 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:36 compute-0 nova_compute[186981]: 2025-11-22 10:11:36.263 186985 DEBUG nova.network.neutron [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Successfully created port: 8477606b-1e0e-478b-b3f5-5851cacc8594 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:11:36 compute-0 nova_compute[186981]: 2025-11-22 10:11:36.922 186985 DEBUG nova.network.neutron [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Successfully updated port: 8477606b-1e0e-478b-b3f5-5851cacc8594 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:11:36 compute-0 nova_compute[186981]: 2025-11-22 10:11:36.942 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:11:36 compute-0 nova_compute[186981]: 2025-11-22 10:11:36.942 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:11:36 compute-0 nova_compute[186981]: 2025-11-22 10:11:36.943 186985 DEBUG nova.network.neutron [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:11:37 compute-0 nova_compute[186981]: 2025-11-22 10:11:37.001 186985 DEBUG nova.compute.manager [req-bff316e6-4314-454e-bc3e-e6f6b5b36d9b req-702ca524-128a-4fef-951a-7958aa322d86 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-changed-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:11:37 compute-0 nova_compute[186981]: 2025-11-22 10:11:37.001 186985 DEBUG nova.compute.manager [req-bff316e6-4314-454e-bc3e-e6f6b5b36d9b req-702ca524-128a-4fef-951a-7958aa322d86 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Refreshing instance network info cache due to event network-changed-8477606b-1e0e-478b-b3f5-5851cacc8594. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:11:37 compute-0 nova_compute[186981]: 2025-11-22 10:11:37.002 186985 DEBUG oslo_concurrency.lockutils [req-bff316e6-4314-454e-bc3e-e6f6b5b36d9b req-702ca524-128a-4fef-951a-7958aa322d86 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:11:37 compute-0 nova_compute[186981]: 2025-11-22 10:11:37.736 186985 DEBUG nova.network.neutron [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:11:38 compute-0 podman[218481]: 2025-11-22 10:11:38.597083811 +0000 UTC m=+0.053679443 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 10:11:38 compute-0 podman[218482]: 2025-11-22 10:11:38.640301817 +0000 UTC m=+0.092414628 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.776 186985 DEBUG nova.network.neutron [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updating instance_info_cache with network_info: [{"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.806 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.807 186985 DEBUG nova.compute.manager [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Instance network_info: |[{"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.807 186985 DEBUG oslo_concurrency.lockutils [req-bff316e6-4314-454e-bc3e-e6f6b5b36d9b req-702ca524-128a-4fef-951a-7958aa322d86 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.807 186985 DEBUG nova.network.neutron [req-bff316e6-4314-454e-bc3e-e6f6b5b36d9b req-702ca524-128a-4fef-951a-7958aa322d86 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Refreshing network info cache for port 8477606b-1e0e-478b-b3f5-5851cacc8594 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.809 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Start _get_guest_xml network_info=[{"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.813 186985 WARNING nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.817 186985 DEBUG nova.virt.libvirt.host [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.817 186985 DEBUG nova.virt.libvirt.host [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.823 186985 DEBUG nova.virt.libvirt.host [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.823 186985 DEBUG nova.virt.libvirt.host [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.823 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.824 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.824 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.824 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.824 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.825 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.825 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.825 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.825 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.825 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.826 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.826 186985 DEBUG nova.virt.hardware [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.828 186985 DEBUG nova.virt.libvirt.vif [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:11:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-432033049',display_name='tempest-TestNetworkBasicOps-server-432033049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-432033049',id=11,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFMjT3SILCckN7hccmQPdQJZ/KxZaeTzO5FvYoEKS1evzYdiPtDC27AgzmpjzTkQ0fm10422f6oVjdCb6vftsFGdHE/l6y7M018xvotYzDwfn0yofl/oqZm0j4BRjxoNXw==',key_name='tempest-TestNetworkBasicOps-397061661',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-nrdivubi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:11:32Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=0cc84ed6-e43a-4e94-8e2e-5a057bbfee73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.829 186985 DEBUG nova.network.os_vif_util [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.829 186985 DEBUG nova.network.os_vif_util [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:04:3e,bridge_name='br-int',has_traffic_filtering=True,id=8477606b-1e0e-478b-b3f5-5851cacc8594,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8477606b-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.830 186985 DEBUG nova.objects.instance [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.846 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:11:38 compute-0 nova_compute[186981]:   <uuid>0cc84ed6-e43a-4e94-8e2e-5a057bbfee73</uuid>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   <name>instance-0000000b</name>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-432033049</nova:name>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:11:38</nova:creationTime>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:11:38 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:11:38 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:11:38 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:11:38 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:11:38 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:11:38 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:11:38 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:11:38 compute-0 nova_compute[186981]:         <nova:port uuid="8477606b-1e0e-478b-b3f5-5851cacc8594">
Nov 22 10:11:38 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <system>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <entry name="serial">0cc84ed6-e43a-4e94-8e2e-5a057bbfee73</entry>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <entry name="uuid">0cc84ed6-e43a-4e94-8e2e-5a057bbfee73</entry>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     </system>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   <os>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   </os>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   <features>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   </features>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.config"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:e9:04:3e"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <target dev="tap8477606b-1e"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/console.log" append="off"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <video>
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     </video>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:11:38 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:11:38 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:11:38 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:11:38 compute-0 nova_compute[186981]: </domain>
Nov 22 10:11:38 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.847 186985 DEBUG nova.compute.manager [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Preparing to wait for external event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.848 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.848 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.849 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.850 186985 DEBUG nova.virt.libvirt.vif [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:11:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-432033049',display_name='tempest-TestNetworkBasicOps-server-432033049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-432033049',id=11,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFMjT3SILCckN7hccmQPdQJZ/KxZaeTzO5FvYoEKS1evzYdiPtDC27AgzmpjzTkQ0fm10422f6oVjdCb6vftsFGdHE/l6y7M018xvotYzDwfn0yofl/oqZm0j4BRjxoNXw==',key_name='tempest-TestNetworkBasicOps-397061661',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-nrdivubi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:11:32Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=0cc84ed6-e43a-4e94-8e2e-5a057bbfee73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.850 186985 DEBUG nova.network.os_vif_util [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.851 186985 DEBUG nova.network.os_vif_util [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:04:3e,bridge_name='br-int',has_traffic_filtering=True,id=8477606b-1e0e-478b-b3f5-5851cacc8594,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8477606b-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.851 186985 DEBUG os_vif [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:04:3e,bridge_name='br-int',has_traffic_filtering=True,id=8477606b-1e0e-478b-b3f5-5851cacc8594,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8477606b-1e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.852 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.852 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.853 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.856 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.856 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8477606b-1e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.857 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8477606b-1e, col_values=(('external_ids', {'iface-id': '8477606b-1e0e-478b-b3f5-5851cacc8594', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:04:3e', 'vm-uuid': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:38 compute-0 NetworkManager[55425]: <info>  [1763806298.8593] manager: (tap8477606b-1e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.858 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.865 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.866 186985 INFO os_vif [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:04:3e,bridge_name='br-int',has_traffic_filtering=True,id=8477606b-1e0e-478b-b3f5-5851cacc8594,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8477606b-1e')
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.918 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.919 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.919 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:e9:04:3e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:11:38 compute-0 nova_compute[186981]: 2025-11-22 10:11:38.920 186985 INFO nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Using config drive
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.200 186985 INFO nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Creating config drive at /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.config
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.205 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzi_tv72d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.332 186985 DEBUG oslo_concurrency.processutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzi_tv72d" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:11:40 compute-0 kernel: tap8477606b-1e: entered promiscuous mode
Nov 22 10:11:40 compute-0 NetworkManager[55425]: <info>  [1763806300.4026] manager: (tap8477606b-1e): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Nov 22 10:11:40 compute-0 ovn_controller[95329]: 2025-11-22T10:11:40Z|00144|binding|INFO|Claiming lport 8477606b-1e0e-478b-b3f5-5851cacc8594 for this chassis.
Nov 22 10:11:40 compute-0 ovn_controller[95329]: 2025-11-22T10:11:40Z|00145|binding|INFO|8477606b-1e0e-478b-b3f5-5851cacc8594: Claiming fa:16:3e:e9:04:3e 10.100.0.13
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.402 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.405 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.424 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:04:3e 10.100.0.13'], port_security=['fa:16:3e:e9:04:3e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3bcc47b5-14ed-4281-bc3d-05f871760286', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e465d698-ac20-4ffb-95b2-d7abfb45d591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4de6700b-c3c2-42de-95b4-e4178e78410b, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=8477606b-1e0e-478b-b3f5-5851cacc8594) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.425 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 8477606b-1e0e-478b-b3f5-5851cacc8594 in datapath 3bcc47b5-14ed-4281-bc3d-05f871760286 bound to our chassis
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.426 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3bcc47b5-14ed-4281-bc3d-05f871760286
Nov 22 10:11:40 compute-0 systemd-udevd[218541]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.437 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[3321eefd-79cf-4f9a-a6a7-f9e7c2b7e607]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.437 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3bcc47b5-11 in ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.440 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3bcc47b5-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.440 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[aed96a94-5b5b-4dcd-b1c3-8d504c7e5e0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.442 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[27761138-ff4b-4163-9204-86f837db50c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 NetworkManager[55425]: <info>  [1763806300.4472] device (tap8477606b-1e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:11:40 compute-0 NetworkManager[55425]: <info>  [1763806300.4480] device (tap8477606b-1e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:11:40 compute-0 systemd-machined[153303]: New machine qemu-11-instance-0000000b.
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.460 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[47d4b7aa-1d6b-4f36-bed0-eadce9eeed01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.460 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:40 compute-0 ovn_controller[95329]: 2025-11-22T10:11:40Z|00146|binding|INFO|Setting lport 8477606b-1e0e-478b-b3f5-5851cacc8594 ovn-installed in OVS
Nov 22 10:11:40 compute-0 ovn_controller[95329]: 2025-11-22T10:11:40Z|00147|binding|INFO|Setting lport 8477606b-1e0e-478b-b3f5-5851cacc8594 up in Southbound
Nov 22 10:11:40 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.467 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.489 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[da06da91-0375-431a-9f7a-1f9cba19313d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.517 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[5912b339-5469-411a-9225-4216183efe98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.525 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[87dfb449-194f-45cb-a5c3-fddcaf3f8977]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 NetworkManager[55425]: <info>  [1763806300.5264] manager: (tap3bcc47b5-10): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.551 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[50eb1c40-4438-4056-b0dd-bde866c89bd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.554 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[48fa9478-ebe7-4eeb-9198-37d23ad89f2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 NetworkManager[55425]: <info>  [1763806300.5772] device (tap3bcc47b5-10): carrier: link connected
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.584 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[af600df4-28f2-4cba-8bb7-a07fe7face1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.604 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[b78b6cda-1cc7-43b1-bc22-1f3626d4c1c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3bcc47b5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:13:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375828, 'reachable_time': 44793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218577, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.618 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[39cd6496-6522-4135-9162-9732dba28198]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:1327'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375828, 'tstamp': 375828}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218578, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.637 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[05f0edb6-0b10-4b91-bb78-445088cd9338]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3bcc47b5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:13:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375828, 'reachable_time': 44793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218579, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.667 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[aef55a8b-ae6d-4293-b35b-c2f89d5fa116]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.725 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7975d7cb-aedd-49da-9a02-4e4c2b312e40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.726 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bcc47b5-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.726 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.727 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bcc47b5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.728 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:40 compute-0 NetworkManager[55425]: <info>  [1763806300.7294] manager: (tap3bcc47b5-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Nov 22 10:11:40 compute-0 kernel: tap3bcc47b5-10: entered promiscuous mode
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.732 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.732 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3bcc47b5-10, col_values=(('external_ids', {'iface-id': 'f00269a4-e7d1-47d4-b0a8-3ef04c233d4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.733 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:40 compute-0 ovn_controller[95329]: 2025-11-22T10:11:40Z|00148|binding|INFO|Releasing lport f00269a4-e7d1-47d4-b0a8-3ef04c233d4f from this chassis (sb_readonly=0)
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.743 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.744 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3bcc47b5-14ed-4281-bc3d-05f871760286.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3bcc47b5-14ed-4281-bc3d-05f871760286.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.745 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[4649ed04-f687-43f9-bb66-48449b252257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.746 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-3bcc47b5-14ed-4281-bc3d-05f871760286
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/3bcc47b5-14ed-4281-bc3d-05f871760286.pid.haproxy
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID 3bcc47b5-14ed-4281-bc3d-05f871760286
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:11:40 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:11:40.746 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'env', 'PROCESS_TAG=haproxy-3bcc47b5-14ed-4281-bc3d-05f871760286', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3bcc47b5-14ed-4281-bc3d-05f871760286.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.767 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806300.7671478, 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.768 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] VM Started (Lifecycle Event)
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.788 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.791 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806300.7714157, 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.791 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] VM Paused (Lifecycle Event)
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.807 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.811 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:11:40 compute-0 nova_compute[186981]: 2025-11-22 10:11:40.827 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:11:41 compute-0 nova_compute[186981]: 2025-11-22 10:11:41.018 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:41 compute-0 podman[218618]: 2025-11-22 10:11:41.092220347 +0000 UTC m=+0.054447754 container create 5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 22 10:11:41 compute-0 systemd[1]: Started libpod-conmon-5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8.scope.
Nov 22 10:11:41 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:11:41 compute-0 podman[218618]: 2025-11-22 10:11:41.065707665 +0000 UTC m=+0.027935182 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:11:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cce43ac059813df9444c6c775500ca10227a2c8f1ae0f7d49b639fd1ba19966f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:11:41 compute-0 podman[218618]: 2025-11-22 10:11:41.180245154 +0000 UTC m=+0.142472581 container init 5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 10:11:41 compute-0 podman[218618]: 2025-11-22 10:11:41.186287778 +0000 UTC m=+0.148515185 container start 5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 10:11:41 compute-0 neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286[218633]: [NOTICE]   (218637) : New worker (218639) forked
Nov 22 10:11:41 compute-0 neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286[218633]: [NOTICE]   (218637) : Loading success.
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.259 186985 DEBUG nova.compute.manager [req-e469c388-6647-4e5a-b286-a725fd1fdc85 req-055c7b4f-d834-404a-87fa-7bdd8238584e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.260 186985 DEBUG oslo_concurrency.lockutils [req-e469c388-6647-4e5a-b286-a725fd1fdc85 req-055c7b4f-d834-404a-87fa-7bdd8238584e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.260 186985 DEBUG oslo_concurrency.lockutils [req-e469c388-6647-4e5a-b286-a725fd1fdc85 req-055c7b4f-d834-404a-87fa-7bdd8238584e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.260 186985 DEBUG oslo_concurrency.lockutils [req-e469c388-6647-4e5a-b286-a725fd1fdc85 req-055c7b4f-d834-404a-87fa-7bdd8238584e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.260 186985 DEBUG nova.compute.manager [req-e469c388-6647-4e5a-b286-a725fd1fdc85 req-055c7b4f-d834-404a-87fa-7bdd8238584e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Processing event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.261 186985 DEBUG nova.compute.manager [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.265 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806302.2648773, 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.265 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] VM Resumed (Lifecycle Event)
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.268 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.272 186985 INFO nova.virt.libvirt.driver [-] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Instance spawned successfully.
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.272 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.296 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.305 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.310 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.311 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.312 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.313 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.313 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.314 186985 DEBUG nova.virt.libvirt.driver [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.359 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.389 186985 INFO nova.compute.manager [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Took 9.54 seconds to spawn the instance on the hypervisor.
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.389 186985 DEBUG nova.compute.manager [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.449 186985 INFO nova.compute.manager [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Took 9.94 seconds to build instance.
Nov 22 10:11:42 compute-0 nova_compute[186981]: 2025-11-22 10:11:42.466 186985 DEBUG oslo_concurrency.lockutils [None req-567c98d2-8b20-46d9-9240-bb6ad9f49b16 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:43 compute-0 nova_compute[186981]: 2025-11-22 10:11:43.860 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:44 compute-0 nova_compute[186981]: 2025-11-22 10:11:44.185 186985 DEBUG nova.network.neutron [req-bff316e6-4314-454e-bc3e-e6f6b5b36d9b req-702ca524-128a-4fef-951a-7958aa322d86 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updated VIF entry in instance network info cache for port 8477606b-1e0e-478b-b3f5-5851cacc8594. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:11:44 compute-0 nova_compute[186981]: 2025-11-22 10:11:44.186 186985 DEBUG nova.network.neutron [req-bff316e6-4314-454e-bc3e-e6f6b5b36d9b req-702ca524-128a-4fef-951a-7958aa322d86 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updating instance_info_cache with network_info: [{"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:11:44 compute-0 nova_compute[186981]: 2025-11-22 10:11:44.202 186985 DEBUG oslo_concurrency.lockutils [req-bff316e6-4314-454e-bc3e-e6f6b5b36d9b req-702ca524-128a-4fef-951a-7958aa322d86 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:11:44 compute-0 nova_compute[186981]: 2025-11-22 10:11:44.343 186985 DEBUG nova.compute.manager [req-f3d7342a-24fc-4bdf-a846-593828070fcd req-f9f6b4d0-4129-46d4-82d3-b6604b152b98 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:11:44 compute-0 nova_compute[186981]: 2025-11-22 10:11:44.343 186985 DEBUG oslo_concurrency.lockutils [req-f3d7342a-24fc-4bdf-a846-593828070fcd req-f9f6b4d0-4129-46d4-82d3-b6604b152b98 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:44 compute-0 nova_compute[186981]: 2025-11-22 10:11:44.343 186985 DEBUG oslo_concurrency.lockutils [req-f3d7342a-24fc-4bdf-a846-593828070fcd req-f9f6b4d0-4129-46d4-82d3-b6604b152b98 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:44 compute-0 nova_compute[186981]: 2025-11-22 10:11:44.343 186985 DEBUG oslo_concurrency.lockutils [req-f3d7342a-24fc-4bdf-a846-593828070fcd req-f9f6b4d0-4129-46d4-82d3-b6604b152b98 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:44 compute-0 nova_compute[186981]: 2025-11-22 10:11:44.344 186985 DEBUG nova.compute.manager [req-f3d7342a-24fc-4bdf-a846-593828070fcd req-f9f6b4d0-4129-46d4-82d3-b6604b152b98 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] No waiting events found dispatching network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:11:44 compute-0 nova_compute[186981]: 2025-11-22 10:11:44.344 186985 WARNING nova.compute.manager [req-f3d7342a-24fc-4bdf-a846-593828070fcd req-f9f6b4d0-4129-46d4-82d3-b6604b152b98 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received unexpected event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 for instance with vm_state active and task_state None.
Nov 22 10:11:45 compute-0 podman[218648]: 2025-11-22 10:11:45.602654763 +0000 UTC m=+0.061710731 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:11:45 compute-0 podman[218649]: 2025-11-22 10:11:45.602746315 +0000 UTC m=+0.060998332 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 10:11:46 compute-0 nova_compute[186981]: 2025-11-22 10:11:46.092 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:47 compute-0 ovn_controller[95329]: 2025-11-22T10:11:47Z|00149|binding|INFO|Releasing lport f00269a4-e7d1-47d4-b0a8-3ef04c233d4f from this chassis (sb_readonly=0)
Nov 22 10:11:47 compute-0 NetworkManager[55425]: <info>  [1763806307.4337] manager: (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Nov 22 10:11:47 compute-0 NetworkManager[55425]: <info>  [1763806307.4343] manager: (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Nov 22 10:11:47 compute-0 nova_compute[186981]: 2025-11-22 10:11:47.434 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:47 compute-0 ovn_controller[95329]: 2025-11-22T10:11:47Z|00150|binding|INFO|Releasing lport f00269a4-e7d1-47d4-b0a8-3ef04c233d4f from this chassis (sb_readonly=0)
Nov 22 10:11:47 compute-0 nova_compute[186981]: 2025-11-22 10:11:47.465 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:47 compute-0 nova_compute[186981]: 2025-11-22 10:11:47.468 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:47 compute-0 nova_compute[186981]: 2025-11-22 10:11:47.736 186985 DEBUG nova.compute.manager [req-6bda4425-ca15-4e4f-ad7b-d8d69c3f2d54 req-28bd19b1-1581-4ca2-9a33-14a6e6d29b17 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-changed-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:11:47 compute-0 nova_compute[186981]: 2025-11-22 10:11:47.737 186985 DEBUG nova.compute.manager [req-6bda4425-ca15-4e4f-ad7b-d8d69c3f2d54 req-28bd19b1-1581-4ca2-9a33-14a6e6d29b17 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Refreshing instance network info cache due to event network-changed-8477606b-1e0e-478b-b3f5-5851cacc8594. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:11:47 compute-0 nova_compute[186981]: 2025-11-22 10:11:47.737 186985 DEBUG oslo_concurrency.lockutils [req-6bda4425-ca15-4e4f-ad7b-d8d69c3f2d54 req-28bd19b1-1581-4ca2-9a33-14a6e6d29b17 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:11:47 compute-0 nova_compute[186981]: 2025-11-22 10:11:47.738 186985 DEBUG oslo_concurrency.lockutils [req-6bda4425-ca15-4e4f-ad7b-d8d69c3f2d54 req-28bd19b1-1581-4ca2-9a33-14a6e6d29b17 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:11:47 compute-0 nova_compute[186981]: 2025-11-22 10:11:47.738 186985 DEBUG nova.network.neutron [req-6bda4425-ca15-4e4f-ad7b-d8d69c3f2d54 req-28bd19b1-1581-4ca2-9a33-14a6e6d29b17 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Refreshing network info cache for port 8477606b-1e0e-478b-b3f5-5851cacc8594 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:11:48 compute-0 podman[218687]: 2025-11-22 10:11:48.610138821 +0000 UTC m=+0.065261698 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:11:48 compute-0 podman[218686]: 2025-11-22 10:11:48.620228946 +0000 UTC m=+0.074797118 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:11:48 compute-0 nova_compute[186981]: 2025-11-22 10:11:48.862 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:49 compute-0 nova_compute[186981]: 2025-11-22 10:11:49.177 186985 DEBUG nova.network.neutron [req-6bda4425-ca15-4e4f-ad7b-d8d69c3f2d54 req-28bd19b1-1581-4ca2-9a33-14a6e6d29b17 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updated VIF entry in instance network info cache for port 8477606b-1e0e-478b-b3f5-5851cacc8594. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:11:49 compute-0 nova_compute[186981]: 2025-11-22 10:11:49.178 186985 DEBUG nova.network.neutron [req-6bda4425-ca15-4e4f-ad7b-d8d69c3f2d54 req-28bd19b1-1581-4ca2-9a33-14a6e6d29b17 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updating instance_info_cache with network_info: [{"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:11:49 compute-0 nova_compute[186981]: 2025-11-22 10:11:49.201 186985 DEBUG oslo_concurrency.lockutils [req-6bda4425-ca15-4e4f-ad7b-d8d69c3f2d54 req-28bd19b1-1581-4ca2-9a33-14a6e6d29b17 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:11:51 compute-0 nova_compute[186981]: 2025-11-22 10:11:51.094 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:53 compute-0 nova_compute[186981]: 2025-11-22 10:11:53.866 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:53 compute-0 ovn_controller[95329]: 2025-11-22T10:11:53Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e9:04:3e 10.100.0.13
Nov 22 10:11:53 compute-0 ovn_controller[95329]: 2025-11-22T10:11:53Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e9:04:3e 10.100.0.13
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.094 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.515 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.515 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.529 186985 DEBUG nova.compute.manager [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.600 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.601 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.612 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.612 186985 INFO nova.compute.claims [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.726 186985 DEBUG nova.compute.provider_tree [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.739 186985 DEBUG nova.scheduler.client.report [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.760 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.761 186985 DEBUG nova.compute.manager [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.804 186985 DEBUG nova.compute.manager [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.805 186985 DEBUG nova.network.neutron [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.822 186985 INFO nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.840 186985 DEBUG nova.compute.manager [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.921 186985 DEBUG nova.compute.manager [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.922 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.923 186985 INFO nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Creating image(s)
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.923 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.924 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.924 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:56 compute-0 nova_compute[186981]: 2025-11-22 10:11:56.937 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.031 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.032 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.032 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.043 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.118 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.119 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.166 186985 DEBUG nova.policy [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.172 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.173 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.174 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.227 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.229 186985 DEBUG nova.virt.disk.api [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.230 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.289 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.290 186985 DEBUG nova.virt.disk.api [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.291 186985 DEBUG nova.objects.instance [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 17c3ed36-93e9-413b-ad7e-15f77d2951f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.307 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.308 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Ensure instance console log exists: /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.309 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.310 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.310 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:57 compute-0 nova_compute[186981]: 2025-11-22 10:11:57.868 186985 DEBUG nova.network.neutron [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Successfully created port: eb61cc86-c8e0-4eda-a84a-3d65295b0944 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:11:58 compute-0 nova_compute[186981]: 2025-11-22 10:11:58.869 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:58 compute-0 nova_compute[186981]: 2025-11-22 10:11:58.873 186985 DEBUG nova.network.neutron [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Successfully updated port: eb61cc86-c8e0-4eda-a84a-3d65295b0944 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:11:58 compute-0 nova_compute[186981]: 2025-11-22 10:11:58.890 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:11:58 compute-0 nova_compute[186981]: 2025-11-22 10:11:58.891 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:11:58 compute-0 nova_compute[186981]: 2025-11-22 10:11:58.891 186985 DEBUG nova.network.neutron [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:11:58 compute-0 nova_compute[186981]: 2025-11-22 10:11:58.962 186985 DEBUG nova.compute.manager [req-30ff19b6-de5c-4ff2-bce1-1008c2fcfc7f req-e5b2d53d-74da-48bc-97a7-02bf3fdba33e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Received event network-changed-eb61cc86-c8e0-4eda-a84a-3d65295b0944 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:11:58 compute-0 nova_compute[186981]: 2025-11-22 10:11:58.963 186985 DEBUG nova.compute.manager [req-30ff19b6-de5c-4ff2-bce1-1008c2fcfc7f req-e5b2d53d-74da-48bc-97a7-02bf3fdba33e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Refreshing instance network info cache due to event network-changed-eb61cc86-c8e0-4eda-a84a-3d65295b0944. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:11:58 compute-0 nova_compute[186981]: 2025-11-22 10:11:58.963 186985 DEBUG oslo_concurrency.lockutils [req-30ff19b6-de5c-4ff2-bce1-1008c2fcfc7f req-e5b2d53d-74da-48bc-97a7-02bf3fdba33e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:11:58 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.002 186985 DEBUG nova.network.neutron [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.653 186985 DEBUG nova.network.neutron [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Updating instance_info_cache with network_info: [{"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.676 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.676 186985 DEBUG nova.compute.manager [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Instance network_info: |[{"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.677 186985 DEBUG oslo_concurrency.lockutils [req-30ff19b6-de5c-4ff2-bce1-1008c2fcfc7f req-e5b2d53d-74da-48bc-97a7-02bf3fdba33e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.677 186985 DEBUG nova.network.neutron [req-30ff19b6-de5c-4ff2-bce1-1008c2fcfc7f req-e5b2d53d-74da-48bc-97a7-02bf3fdba33e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Refreshing network info cache for port eb61cc86-c8e0-4eda-a84a-3d65295b0944 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.680 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Start _get_guest_xml network_info=[{"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.686 186985 WARNING nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.697 186985 DEBUG nova.virt.libvirt.host [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.697 186985 DEBUG nova.virt.libvirt.host [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.702 186985 DEBUG nova.virt.libvirt.host [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.703 186985 DEBUG nova.virt.libvirt.host [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.704 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.704 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.704 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.705 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.705 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.705 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.705 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.706 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.706 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.706 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.707 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.707 186985 DEBUG nova.virt.hardware [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.711 186985 DEBUG nova.virt.libvirt.vif [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:11:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-867812263',display_name='tempest-TestNetworkBasicOps-server-867812263',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-867812263',id=12,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNiGzHVfNRblYz4Oj3U14Lxniim9sMmEJRIjLtwT8EFdvSQjQNjZXAeN/EbWFjndF8YRi51URdR2HBrMGZTwNn7ahCwB6paEmxUEalYL9hIE0q5QlinSiU0G2FyPmsPVkQ==',key_name='tempest-TestNetworkBasicOps-1075170225',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-d4kmj3e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:11:56Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=17c3ed36-93e9-413b-ad7e-15f77d2951f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.711 186985 DEBUG nova.network.os_vif_util [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.712 186985 DEBUG nova.network.os_vif_util [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ad:c3,bridge_name='br-int',has_traffic_filtering=True,id=eb61cc86-c8e0-4eda-a84a-3d65295b0944,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb61cc86-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.713 186985 DEBUG nova.objects.instance [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 17c3ed36-93e9-413b-ad7e-15f77d2951f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.733 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:11:59 compute-0 nova_compute[186981]:   <uuid>17c3ed36-93e9-413b-ad7e-15f77d2951f1</uuid>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   <name>instance-0000000c</name>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-867812263</nova:name>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:11:59</nova:creationTime>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:11:59 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:11:59 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:11:59 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:11:59 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:11:59 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:11:59 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:11:59 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:11:59 compute-0 nova_compute[186981]:         <nova:port uuid="eb61cc86-c8e0-4eda-a84a-3d65295b0944">
Nov 22 10:11:59 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <system>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <entry name="serial">17c3ed36-93e9-413b-ad7e-15f77d2951f1</entry>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <entry name="uuid">17c3ed36-93e9-413b-ad7e-15f77d2951f1</entry>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     </system>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   <os>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   </os>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   <features>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   </features>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.config"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:a4:ad:c3"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <target dev="tapeb61cc86-c8"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/console.log" append="off"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <video>
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     </video>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:11:59 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:11:59 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:11:59 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:11:59 compute-0 nova_compute[186981]: </domain>
Nov 22 10:11:59 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.734 186985 DEBUG nova.compute.manager [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Preparing to wait for external event network-vif-plugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.734 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.735 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.735 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.736 186985 DEBUG nova.virt.libvirt.vif [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:11:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-867812263',display_name='tempest-TestNetworkBasicOps-server-867812263',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-867812263',id=12,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNiGzHVfNRblYz4Oj3U14Lxniim9sMmEJRIjLtwT8EFdvSQjQNjZXAeN/EbWFjndF8YRi51URdR2HBrMGZTwNn7ahCwB6paEmxUEalYL9hIE0q5QlinSiU0G2FyPmsPVkQ==',key_name='tempest-TestNetworkBasicOps-1075170225',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-d4kmj3e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:11:56Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=17c3ed36-93e9-413b-ad7e-15f77d2951f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.736 186985 DEBUG nova.network.os_vif_util [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.737 186985 DEBUG nova.network.os_vif_util [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ad:c3,bridge_name='br-int',has_traffic_filtering=True,id=eb61cc86-c8e0-4eda-a84a-3d65295b0944,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb61cc86-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.737 186985 DEBUG os_vif [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ad:c3,bridge_name='br-int',has_traffic_filtering=True,id=eb61cc86-c8e0-4eda-a84a-3d65295b0944,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb61cc86-c8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.738 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.738 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.739 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.743 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.743 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb61cc86-c8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.744 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeb61cc86-c8, col_values=(('external_ids', {'iface-id': 'eb61cc86-c8e0-4eda-a84a-3d65295b0944', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:ad:c3', 'vm-uuid': '17c3ed36-93e9-413b-ad7e-15f77d2951f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.783 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:59 compute-0 NetworkManager[55425]: <info>  [1763806319.7862] manager: (tapeb61cc86-c8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.787 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.790 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.791 186985 INFO os_vif [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ad:c3,bridge_name='br-int',has_traffic_filtering=True,id=eb61cc86-c8e0-4eda-a84a-3d65295b0944,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb61cc86-c8')
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.848 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.848 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.848 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:a4:ad:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:11:59 compute-0 nova_compute[186981]: 2025-11-22 10:11:59.849 186985 INFO nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Using config drive
Nov 22 10:11:59 compute-0 podman[218756]: 2025-11-22 10:11:59.895746387 +0000 UTC m=+0.065816014 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 10:12:01 compute-0 nova_compute[186981]: 2025-11-22 10:12:01.097 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.162 186985 INFO nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Creating config drive at /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.config
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.172 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe035zv40 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.301 186985 DEBUG oslo_concurrency.processutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe035zv40" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:12:02 compute-0 kernel: tapeb61cc86-c8: entered promiscuous mode
Nov 22 10:12:02 compute-0 NetworkManager[55425]: <info>  [1763806322.3668] manager: (tapeb61cc86-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Nov 22 10:12:02 compute-0 ovn_controller[95329]: 2025-11-22T10:12:02Z|00151|binding|INFO|Claiming lport eb61cc86-c8e0-4eda-a84a-3d65295b0944 for this chassis.
Nov 22 10:12:02 compute-0 ovn_controller[95329]: 2025-11-22T10:12:02Z|00152|binding|INFO|eb61cc86-c8e0-4eda-a84a-3d65295b0944: Claiming fa:16:3e:a4:ad:c3 10.100.0.5
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.367 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.386 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:ad:c3 10.100.0.5'], port_security=['fa:16:3e:a4:ad:c3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3bcc47b5-14ed-4281-bc3d-05f871760286', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9c7fdb23-f90b-44b5-b277-8b2cd6211afa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4de6700b-c3c2-42de-95b4-e4178e78410b, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=eb61cc86-c8e0-4eda-a84a-3d65295b0944) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:12:02 compute-0 ovn_controller[95329]: 2025-11-22T10:12:02Z|00153|binding|INFO|Setting lport eb61cc86-c8e0-4eda-a84a-3d65295b0944 ovn-installed in OVS
Nov 22 10:12:02 compute-0 ovn_controller[95329]: 2025-11-22T10:12:02Z|00154|binding|INFO|Setting lport eb61cc86-c8e0-4eda-a84a-3d65295b0944 up in Southbound
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.387 104216 INFO neutron.agent.ovn.metadata.agent [-] Port eb61cc86-c8e0-4eda-a84a-3d65295b0944 in datapath 3bcc47b5-14ed-4281-bc3d-05f871760286 bound to our chassis
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.388 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3bcc47b5-14ed-4281-bc3d-05f871760286
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.388 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.389 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:02 compute-0 systemd-udevd[218795]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:12:02 compute-0 NetworkManager[55425]: <info>  [1763806322.4060] device (tapeb61cc86-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:12:02 compute-0 NetworkManager[55425]: <info>  [1763806322.4084] device (tapeb61cc86-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.408 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[e13097c7-4f34-42fe-a141-57080f65219e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:02 compute-0 systemd-machined[153303]: New machine qemu-12-instance-0000000c.
Nov 22 10:12:02 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.442 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[03418aac-7efd-4caa-8b28-5fa9ac106a18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.448 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[d9357f2f-ac4b-4b99-bbf8-1444b71cd62f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.482 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[ac1c3300-7f2c-46ab-a050-c06dac0467e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.496 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[ea43d690-2c4b-4bc9-baa7-cd5e742ffd4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3bcc47b5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:13:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375828, 'reachable_time': 44793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218809, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.519 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[5842799a-037a-4c31-aaeb-f3313046d2b5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3bcc47b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375839, 'tstamp': 375839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218812, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3bcc47b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375842, 'tstamp': 375842}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218812, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.520 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bcc47b5-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.522 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.523 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.524 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bcc47b5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.524 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.525 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3bcc47b5-10, col_values=(('external_ids', {'iface-id': 'f00269a4-e7d1-47d4-b0a8-3ef04c233d4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:02 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:02.525 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.745 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806322.745019, 17c3ed36-93e9-413b-ad7e-15f77d2951f1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.746 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] VM Started (Lifecycle Event)
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.767 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.771 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806322.7461126, 17c3ed36-93e9-413b-ad7e-15f77d2951f1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.772 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] VM Paused (Lifecycle Event)
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.789 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.794 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:12:02 compute-0 nova_compute[186981]: 2025-11-22 10:12:02.815 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.444 186985 DEBUG nova.compute.manager [req-3ee99c82-d8b2-4fd1-b1d1-5f23abe3c218 req-934f5c66-b91f-4cce-a988-1a4f05667fc2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Received event network-vif-plugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.445 186985 DEBUG oslo_concurrency.lockutils [req-3ee99c82-d8b2-4fd1-b1d1-5f23abe3c218 req-934f5c66-b91f-4cce-a988-1a4f05667fc2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.445 186985 DEBUG oslo_concurrency.lockutils [req-3ee99c82-d8b2-4fd1-b1d1-5f23abe3c218 req-934f5c66-b91f-4cce-a988-1a4f05667fc2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.445 186985 DEBUG oslo_concurrency.lockutils [req-3ee99c82-d8b2-4fd1-b1d1-5f23abe3c218 req-934f5c66-b91f-4cce-a988-1a4f05667fc2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.445 186985 DEBUG nova.compute.manager [req-3ee99c82-d8b2-4fd1-b1d1-5f23abe3c218 req-934f5c66-b91f-4cce-a988-1a4f05667fc2 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Processing event network-vif-plugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.446 186985 DEBUG nova.network.neutron [req-30ff19b6-de5c-4ff2-bce1-1008c2fcfc7f req-e5b2d53d-74da-48bc-97a7-02bf3fdba33e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Updated VIF entry in instance network info cache for port eb61cc86-c8e0-4eda-a84a-3d65295b0944. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.446 186985 DEBUG nova.network.neutron [req-30ff19b6-de5c-4ff2-bce1-1008c2fcfc7f req-e5b2d53d-74da-48bc-97a7-02bf3fdba33e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Updating instance_info_cache with network_info: [{"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.448 186985 DEBUG nova.compute.manager [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.451 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806323.4517257, 17c3ed36-93e9-413b-ad7e-15f77d2951f1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.452 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] VM Resumed (Lifecycle Event)
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.453 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.455 186985 INFO nova.virt.libvirt.driver [-] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Instance spawned successfully.
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.456 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.478 186985 DEBUG oslo_concurrency.lockutils [req-30ff19b6-de5c-4ff2-bce1-1008c2fcfc7f req-e5b2d53d-74da-48bc-97a7-02bf3fdba33e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.479 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.483 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.483 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.484 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.484 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.485 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.485 186985 DEBUG nova.virt.libvirt.driver [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.488 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.515 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.541 186985 INFO nova.compute.manager [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Took 6.62 seconds to spawn the instance on the hypervisor.
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.542 186985 DEBUG nova.compute.manager [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.598 186985 INFO nova.compute.manager [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Took 7.03 seconds to build instance.
Nov 22 10:12:03 compute-0 nova_compute[186981]: 2025-11-22 10:12:03.614 186985 DEBUG oslo_concurrency.lockutils [None req-28357ce7-da9c-4c2b-bc83-05c0b246c566 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:04 compute-0 nova_compute[186981]: 2025-11-22 10:12:04.785 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:05 compute-0 nova_compute[186981]: 2025-11-22 10:12:05.413 186985 DEBUG nova.compute.manager [req-f0874f8a-89f5-41be-94ec-649afac93be8 req-8caa1daf-5311-45cf-be9b-b2a24c9efc1b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Received event network-vif-plugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:05 compute-0 nova_compute[186981]: 2025-11-22 10:12:05.414 186985 DEBUG oslo_concurrency.lockutils [req-f0874f8a-89f5-41be-94ec-649afac93be8 req-8caa1daf-5311-45cf-be9b-b2a24c9efc1b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:05 compute-0 nova_compute[186981]: 2025-11-22 10:12:05.414 186985 DEBUG oslo_concurrency.lockutils [req-f0874f8a-89f5-41be-94ec-649afac93be8 req-8caa1daf-5311-45cf-be9b-b2a24c9efc1b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:05 compute-0 nova_compute[186981]: 2025-11-22 10:12:05.415 186985 DEBUG oslo_concurrency.lockutils [req-f0874f8a-89f5-41be-94ec-649afac93be8 req-8caa1daf-5311-45cf-be9b-b2a24c9efc1b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:05 compute-0 nova_compute[186981]: 2025-11-22 10:12:05.415 186985 DEBUG nova.compute.manager [req-f0874f8a-89f5-41be-94ec-649afac93be8 req-8caa1daf-5311-45cf-be9b-b2a24c9efc1b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] No waiting events found dispatching network-vif-plugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:12:05 compute-0 nova_compute[186981]: 2025-11-22 10:12:05.416 186985 WARNING nova.compute.manager [req-f0874f8a-89f5-41be-94ec-649afac93be8 req-8caa1daf-5311-45cf-be9b-b2a24c9efc1b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Received unexpected event network-vif-plugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 for instance with vm_state active and task_state None.
Nov 22 10:12:06 compute-0 nova_compute[186981]: 2025-11-22 10:12:06.098 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:08 compute-0 nova_compute[186981]: 2025-11-22 10:12:08.462 186985 DEBUG nova.compute.manager [req-a0c26131-150b-4fc6-bfca-c1bfa1750471 req-298e166d-16ec-483f-8640-451aaaedb2e5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Received event network-changed-eb61cc86-c8e0-4eda-a84a-3d65295b0944 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:08 compute-0 nova_compute[186981]: 2025-11-22 10:12:08.462 186985 DEBUG nova.compute.manager [req-a0c26131-150b-4fc6-bfca-c1bfa1750471 req-298e166d-16ec-483f-8640-451aaaedb2e5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Refreshing instance network info cache due to event network-changed-eb61cc86-c8e0-4eda-a84a-3d65295b0944. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:12:08 compute-0 nova_compute[186981]: 2025-11-22 10:12:08.463 186985 DEBUG oslo_concurrency.lockutils [req-a0c26131-150b-4fc6-bfca-c1bfa1750471 req-298e166d-16ec-483f-8640-451aaaedb2e5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:12:08 compute-0 nova_compute[186981]: 2025-11-22 10:12:08.463 186985 DEBUG oslo_concurrency.lockutils [req-a0c26131-150b-4fc6-bfca-c1bfa1750471 req-298e166d-16ec-483f-8640-451aaaedb2e5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:12:08 compute-0 nova_compute[186981]: 2025-11-22 10:12:08.463 186985 DEBUG nova.network.neutron [req-a0c26131-150b-4fc6-bfca-c1bfa1750471 req-298e166d-16ec-483f-8640-451aaaedb2e5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Refreshing network info cache for port eb61cc86-c8e0-4eda-a84a-3d65295b0944 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:12:09 compute-0 podman[218821]: 2025-11-22 10:12:09.614505756 +0000 UTC m=+0.064255311 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 22 10:12:09 compute-0 podman[218822]: 2025-11-22 10:12:09.643575397 +0000 UTC m=+0.094907524 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 10:12:09 compute-0 nova_compute[186981]: 2025-11-22 10:12:09.788 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:10 compute-0 nova_compute[186981]: 2025-11-22 10:12:10.443 186985 DEBUG nova.network.neutron [req-a0c26131-150b-4fc6-bfca-c1bfa1750471 req-298e166d-16ec-483f-8640-451aaaedb2e5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Updated VIF entry in instance network info cache for port eb61cc86-c8e0-4eda-a84a-3d65295b0944. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:12:10 compute-0 nova_compute[186981]: 2025-11-22 10:12:10.444 186985 DEBUG nova.network.neutron [req-a0c26131-150b-4fc6-bfca-c1bfa1750471 req-298e166d-16ec-483f-8640-451aaaedb2e5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Updating instance_info_cache with network_info: [{"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:12:10 compute-0 nova_compute[186981]: 2025-11-22 10:12:10.468 186985 DEBUG oslo_concurrency.lockutils [req-a0c26131-150b-4fc6-bfca-c1bfa1750471 req-298e166d-16ec-483f-8640-451aaaedb2e5 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:12:11 compute-0 nova_compute[186981]: 2025-11-22 10:12:11.101 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:14 compute-0 nova_compute[186981]: 2025-11-22 10:12:14.818 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:15 compute-0 ovn_controller[95329]: 2025-11-22T10:12:15Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a4:ad:c3 10.100.0.5
Nov 22 10:12:15 compute-0 ovn_controller[95329]: 2025-11-22T10:12:15Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a4:ad:c3 10.100.0.5
Nov 22 10:12:15 compute-0 nova_compute[186981]: 2025-11-22 10:12:15.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:15 compute-0 nova_compute[186981]: 2025-11-22 10:12:15.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 10:12:15 compute-0 nova_compute[186981]: 2025-11-22 10:12:15.609 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 10:12:16 compute-0 nova_compute[186981]: 2025-11-22 10:12:16.104 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:16 compute-0 nova_compute[186981]: 2025-11-22 10:12:16.608 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:16 compute-0 nova_compute[186981]: 2025-11-22 10:12:16.609 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:12:16 compute-0 nova_compute[186981]: 2025-11-22 10:12:16.609 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:12:16 compute-0 podman[218881]: 2025-11-22 10:12:16.618848095 +0000 UTC m=+0.060898589 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 10:12:16 compute-0 podman[218880]: 2025-11-22 10:12:16.627509021 +0000 UTC m=+0.063082249 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.840 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'name': 'tempest-TestNetworkBasicOps-server-432033049', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'user_id': 'fd88a700663e44618f0a22f234573806', 'hostId': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.842 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'name': 'tempest-TestNetworkBasicOps-server-867812263', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'user_id': 'fd88a700663e44618f0a22f234573806', 'hostId': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.843 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.877 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.read.latency volume: 531386722 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.878 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.read.latency volume: 82978737 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.909 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.read.latency volume: 475866523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.909 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.read.latency volume: 94279563 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d0c7438-db98-42e6-8c1b-60f86671e4f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 531386722, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-vda', 'timestamp': '2025-11-22T10:12:16.843268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9208632-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': '4131d2f9685318910b3e5196014cdbf20e3310b2623e1e67df1e207cab4b1aa7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 82978737, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-sda', 'timestamp': '2025-11-22T10:12:16.843268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9209a32-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': 'c90ae4d22c5c436542bdc694eca08955edb3fde609275855bd55fb0f66d8df53'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 475866523, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-vda', 'timestamp': '2025-11-22T10:12:16.843268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9254d84-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': '770c1aa89168a3715eecb04bb71ffabc4c941db88b514f72ac1cde6c70844184'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 94279563, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-sda', 'timestamp': '2025-11-22T10:12:16.843268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9255cac-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': 'cfd8edb85dd1b3479f9f2a98b2f29023b1c67f39935c5631b164bec60f7b2c2e'}]}, 'timestamp': '2025-11-22 10:12:16.910192', '_unique_id': 'e9f33b9966a04cabaf15da4054efa227'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.914 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73 / tap8477606b-1e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.914 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.916 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 17c3ed36-93e9-413b-ad7e-15f77d2951f1 / tapeb61cc86-c8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.916 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acec4e6d-c189-4c76-9399-5c17a7529f4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000b-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-tap8477606b-1e', 'timestamp': '2025-11-22T10:12:16.912473', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'tap8477606b-1e', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:04:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8477606b-1e'}, 'message_id': 'b92619bc-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.681972803, 'message_signature': '3c5cc4417a5065b1347a73dc9fc2e21974c73d95cc34d8b349fc181b750f5bd9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000c-17c3ed36-93e9-413b-ad7e-15f77d2951f1-tapeb61cc86-c8', 'timestamp': '2025-11-22T10:12:16.912473', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'tapeb61cc86-c8', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:ad:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeb61cc86-c8'}, 'message_id': 'b9266a8e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.684468051, 'message_signature': '0bd7bae9917d1805596652d7affdbb592473f41c68f20dbdf8554f061fbfd3e4'}]}, 'timestamp': '2025-11-22 10:12:16.917110', '_unique_id': 'ee3ed3d3d3ae4a21b75274ca81d96240'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.918 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.write.bytes volume: 72933376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.919 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.919 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.write.bytes volume: 72769536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.919 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a61d7166-fa51-4749-98c7-5d619a6fe919', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72933376, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-vda', 'timestamp': '2025-11-22T10:12:16.918791', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b926b868-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': '2ee5942cc8eb862d9395993691f24dd4c2bf8fe5fde0962d8c5f38e42f3b2f50'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 
'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-sda', 'timestamp': '2025-11-22T10:12:16.918791', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b926c470-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': '12d4665ccdb7b04254b76da453a6493f6534058adb81d3bff06afa4a3f009c28'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72769536, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-vda', 'timestamp': '2025-11-22T10:12:16.918791', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b926cf24-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': '713d1e94edf94d03124fe41db168e49780215d8df4f4ed78239a54b6f2cb9382'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-sda', 'timestamp': '2025-11-22T10:12:16.918791', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b926dadc-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': '716dd34d8cc8576d0bd923ebaa0a7a858aa44dcb811198069b0d8f4b20d05c8a'}]}, 'timestamp': '2025-11-22 10:12:16.919969', '_unique_id': '74fd6dbb1d5b4712bbd5d6443803cbb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.921 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.921 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/network.outgoing.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33e9999a-66e0-414c-9874-203a7636dda6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000b-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-tap8477606b-1e', 'timestamp': '2025-11-22T10:12:16.921501', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'tap8477606b-1e', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:04:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8477606b-1e'}, 'message_id': 'b927232a-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.681972803, 'message_signature': 'd6f2eeb2eded18bae114238c820b99e4f29a0ac23b33fadb899b4f081a3087ed'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 
'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000c-17c3ed36-93e9-413b-ad7e-15f77d2951f1-tapeb61cc86-c8', 'timestamp': '2025-11-22T10:12:16.921501', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'tapeb61cc86-c8', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:ad:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeb61cc86-c8'}, 'message_id': 'b9272ec4-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.684468051, 'message_signature': 'fca73518082eeb81915524ebf5bfe95737f03699c05e25754d6210d160348a14'}]}, 'timestamp': '2025-11-22 10:12:16.922134', '_unique_id': '133dd89121ee4bcb96407cc17521d699'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.923 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.read.bytes volume: 31111680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.924 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.924 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.read.bytes volume: 30800384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.924 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7795cc63-98d0-4d18-a962-07e06b71e7cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31111680, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-vda', 'timestamp': '2025-11-22T10:12:16.923841', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9277d52-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': '45c96b082617e92b78d49264bf3e282b524f467657ebf27cabd42729b1ccf3da'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 
'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-sda', 'timestamp': '2025-11-22T10:12:16.923841', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b92786f8-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': '4df3fcf7bd2180dde0c8e7269ec96f48cf8517f613a049d228d479096e527e30'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30800384, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-vda', 'timestamp': '2025-11-22T10:12:16.923841', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b927918e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': 'cf88d659ef2dac652e6a2ec7aebc9450921bac22e40c8d513af09fb98a3e2840'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-sda', 'timestamp': '2025-11-22T10:12:16.923841', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9279d6e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': 'edd9e9768ecf6ef3d425bfd369a87d4e90bd10d37686bfd1114757ef6674764c'}]}, 'timestamp': '2025-11-22 10:12:16.924936', '_unique_id': 'df74f6a54881408f946b041c7f362701'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.926 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.write.latency volume: 2478415663 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.926 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.926 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.write.latency volume: 2274729514 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.926 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b31336c5-d4be-413b-8ffe-257e49b10dd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2478415663, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-vda', 'timestamp': '2025-11-22T10:12:16.926253', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b927da40-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': '8139ef6b2d61c80d4b026b32b62e3c0f04a24e70f0663334474b586338895a3c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 
'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-sda', 'timestamp': '2025-11-22T10:12:16.926253', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b927e36e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': 'e87dbab86043ed220973026fe942512e1aa2360e3c3258ce5f49f6291288d212'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2274729514, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-vda', 'timestamp': '2025-11-22T10:12:16.926253', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b927eb52-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': 'f348420a288b4a0f07a950b4486c1c6b8983c0b76422163b66a8ca1ae0a86921'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-sda', 'timestamp': '2025-11-22T10:12:16.926253', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b927f2b4-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': 'a10773e96d5a982131f0de0881f30f25c733d4f5e372769d7ef4bb4c0c602d65'}]}, 'timestamp': '2025-11-22 10:12:16.927103', '_unique_id': '582a669c8656485f92c555ba9c820f97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.937 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.938 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.946 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.946 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '310eb334-d668-401b-9ed4-1a2388b141cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-vda', 'timestamp': '2025-11-22T10:12:16.928265', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9299f42-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.697754763, 'message_signature': '8684913743a49c200089372341761d316b6cdb01e935b8a2112aee9f7427de58'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 
'0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-sda', 'timestamp': '2025-11-22T10:12:16.928265', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b929a898-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.697754763, 'message_signature': 'b1057037cee80f9e2bc5201222468eb1a5dbd0e67f4cbcd3e5eeeae336da64cd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-vda', 'timestamp': '2025-11-22T10:12:16.928265', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 
'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b92ae91a-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.707772806, 'message_signature': 'a83e6db6691fd27757f54d0b69b34b23a227715e74dc016e33eab5a5baa6c3a5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-sda', 'timestamp': '2025-11-22T10:12:16.928265', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b92af2c0-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.707772806, 'message_signature': '80564951df570f37c590c26fb20b993094bfd06081773a09b349036910d3fa07'}]}, 'timestamp': '2025-11-22 10:12:16.946765', '_unique_id': '379c85823ffd4f5a9add66f47669ec68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.947 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.948 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.948 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.write.requests volume: 317 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.948 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.948 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.write.requests volume: 325 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e610427-9a4d-4d83-ae30-2f76e7a61b50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 317, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-vda', 'timestamp': '2025-11-22T10:12:16.948349', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b92b3a00-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': '7a9686815c7b7d4a040b5d35562b18791ca6c0de7f4ea66fedc54a46187fd604'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 
'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-sda', 'timestamp': '2025-11-22T10:12:16.948349', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b92b4554-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': 'c03bd6878e667aca6e92e9265f0c368bf559b4eab21e7ed1716409e840a74748'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 325, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-vda', 'timestamp': '2025-11-22T10:12:16.948349', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b92b4db0-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': 'b46f41162334d907b79c11333c94a54dbce9b9f515893c41d5ae1c2c35e092cd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-sda', 'timestamp': '2025-11-22T10:12:16.948349', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b92b55bc-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': '85a6cbfd67dba5013e6a8f9fd822cfc6aeabcd60a9f1475a42a0af9e6eb09c55'}]}, 'timestamp': '2025-11-22 10:12:16.949285', '_unique_id': 'fcdc47a071684767a6bb6548db0ab527'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.950 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.950 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-432033049>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-867812263>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-432033049>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-867812263>]
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.950 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.read.requests volume: 1140 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.951 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.951 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.read.requests volume: 1125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.951 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a134895-64a3-4ed8-a752-ba8071adf678', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1140, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-vda', 'timestamp': '2025-11-22T10:12:16.950839', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b92b9aae-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': 'd244787042237181c6275468c359ed476dcbee23945b66772f108fcd4c5853a4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': 
None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-sda', 'timestamp': '2025-11-22T10:12:16.950839', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b92ba29c-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.612759269, 'message_signature': 'f8ddcf56e8d27d1844be6d1589e64558abb8ae0a2e8d24a53fb2e08dc5237ab0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1125, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-vda', 'timestamp': '2025-11-22T10:12:16.950839', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b92baa12-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': 'ebf8a3f92dc335b12af905a96d4166540c64856ffefa71055f677cc5bbe940a3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-sda', 'timestamp': '2025-11-22T10:12:16.950839', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b92bb142-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.648495452, 'message_signature': '84284e816a37bf126c55c3f76d9b055f9d147ed89b30393b007dffaf6fd08046'}]}, 'timestamp': '2025-11-22 10:12:16.951643', '_unique_id': '316ec011dcd641d8a03aeed13acc0888'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.952 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.966 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/cpu volume: 11250000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.978 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/cpu volume: 11330000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f91032d-1650-4328-878e-89dbb7fc1842', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11250000000, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'timestamp': '2025-11-22T10:12:16.952840', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b92df614-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.735597323, 'message_signature': '145da4a202386d2c68e12d97816998f646aec9d3c7047f791a33aa278f83fb71'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11330000000, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 
'17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'timestamp': '2025-11-22T10:12:16.952840', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b92fc71e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.747501958, 'message_signature': 'db12048fc68aa2168c36e28b1a0f283295e96a42d4e33ac59c65a29137364cea'}]}, 'timestamp': '2025-11-22 10:12:16.978448', '_unique_id': '859b88ea123e4674ad29ce70ca4ca368'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.980 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.980 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/network.incoming.bytes volume: 1436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.980 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/network.incoming.bytes volume: 1642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69c8c4c3-1a96-447a-b9ed-bca25aee79a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1436, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000b-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-tap8477606b-1e', 'timestamp': '2025-11-22T10:12:16.980121', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'tap8477606b-1e', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:04:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8477606b-1e'}, 'message_id': 'b93014a8-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.681972803, 'message_signature': 'f31358aacf12b917a99228ab511129e65adaa91acaec8fa31808e3c758493ad1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1642, 'user_id': 
'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000c-17c3ed36-93e9-413b-ad7e-15f77d2951f1-tapeb61cc86-c8', 'timestamp': '2025-11-22T10:12:16.980121', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'tapeb61cc86-c8', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:ad:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeb61cc86-c8'}, 'message_id': 'b9301d68-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.684468051, 'message_signature': 'a7484e418113e042f1d50520a48a63e31e8eb660e8e26fd18c94d504a510f8db'}]}, 'timestamp': '2025-11-22 10:12:16.980646', '_unique_id': '5702c4c7a4dc45fb8bffac44e62642b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.981 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-432033049>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-867812263>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-432033049>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-867812263>]
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.982 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.982 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-432033049>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-867812263>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-432033049>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-867812263>]
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.982 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.982 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.982 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16df4479-31a7-4380-9640-2f8e8970e1ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000b-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-tap8477606b-1e', 'timestamp': '2025-11-22T10:12:16.982369', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'tap8477606b-1e', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:04:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8477606b-1e'}, 'message_id': 'b9306a3e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.681972803, 'message_signature': '4e5f6519f4d23e4440687a01d197beade1fb5821cecdbb9412561715d8d6e840'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 
'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000c-17c3ed36-93e9-413b-ad7e-15f77d2951f1-tapeb61cc86-c8', 'timestamp': '2025-11-22T10:12:16.982369', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'tapeb61cc86-c8', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:ad:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeb61cc86-c8'}, 'message_id': 'b930734e-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.684468051, 'message_signature': 'd0b2c54a9896173203c2a87e6d169f9882bde4d802e013bbf100a7b4c4931b08'}]}, 'timestamp': '2025-11-22 10:12:16.982816', '_unique_id': 'f30fd151d947498e8fdb6ebcac6a8d91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.983 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-432033049>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-867812263>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-432033049>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-867812263>]
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.984 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.984 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.984 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f97dd5ab-5e5f-4ad5-9eb3-bb00fda4ac95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000b-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-tap8477606b-1e', 'timestamp': '2025-11-22T10:12:16.984174', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'tap8477606b-1e', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:04:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8477606b-1e'}, 'message_id': 'b930b0d4-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.681972803, 'message_signature': '795bd04c5108b0115fd4c5f11cba5d7ee0beb80f503a43e6dedb94fe8c6cc340'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000c-17c3ed36-93e9-413b-ad7e-15f77d2951f1-tapeb61cc86-c8', 'timestamp': '2025-11-22T10:12:16.984174', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'tapeb61cc86-c8', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:ad:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeb61cc86-c8'}, 'message_id': 'b930b8fe-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.684468051, 'message_signature': '5ab84cd6c377049bcbbc7dd708a3df2063b436fab903fa2732559c2db56248be'}]}, 'timestamp': '2025-11-22 10:12:16.984624', '_unique_id': '519f1eba969d4e9c89109c8812532f90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.985 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65c3d61e-d714-494a-91df-38cb814d14b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000b-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-tap8477606b-1e', 'timestamp': '2025-11-22T10:12:16.985698', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'tap8477606b-1e', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:04:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8477606b-1e'}, 'message_id': 'b930ec84-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.681972803, 'message_signature': '96dabc3e365a13ec55ccfca91cfe69b5e4fe8e190c990ebe393ccf383e13dad4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000c-17c3ed36-93e9-413b-ad7e-15f77d2951f1-tapeb61cc86-c8', 'timestamp': '2025-11-22T10:12:16.985698', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'tapeb61cc86-c8', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:ad:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeb61cc86-c8'}, 'message_id': 'b930f4ae-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.684468051, 'message_signature': 'a372491562a826f3f533f500d32deb67b94d245f9fa5e14feb081789626bf750'}]}, 'timestamp': '2025-11-22 10:12:16.986126', '_unique_id': '74cd4c57f0f0470b9c95b7de8c5be363'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.986 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.987 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.987 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '382c14e6-58e6-4bc2-aee2-2d289bac47b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000b-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-tap8477606b-1e', 'timestamp': '2025-11-22T10:12:16.987188', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'tap8477606b-1e', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:04:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8477606b-1e'}, 'message_id': 'b9312668-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.681972803, 'message_signature': '74c0f7737e61ff747f36b89e78353faa1efadc6786e13061f412b228e6f8b9ee'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000c-17c3ed36-93e9-413b-ad7e-15f77d2951f1-tapeb61cc86-c8', 'timestamp': '2025-11-22T10:12:16.987188', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'tapeb61cc86-c8', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:ad:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeb61cc86-c8'}, 'message_id': 'b9312ef6-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.684468051, 'message_signature': '2c26ab7ee0475fb8773ef07013b43c965188bf0e222b9b412c8aa89cf9441cd6'}]}, 'timestamp': '2025-11-22 10:12:16.987637', '_unique_id': '4abb45ebdac04a59a97aa59f6a9a5521'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.988 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/network.outgoing.bytes volume: 1194 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dca3f00a-a949-4c15-bf7f-835672313040', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000b-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-tap8477606b-1e', 'timestamp': '2025-11-22T10:12:16.988729', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'tap8477606b-1e', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:04:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8477606b-1e'}, 'message_id': 'b931633a-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.681972803, 'message_signature': '04fc3237d84b82555efa5c15d583be6f43503fab19f648b866306c1f973192a0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1194, 'user_id': 
'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000c-17c3ed36-93e9-413b-ad7e-15f77d2951f1-tapeb61cc86-c8', 'timestamp': '2025-11-22T10:12:16.988729', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'tapeb61cc86-c8', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:ad:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeb61cc86-c8'}, 'message_id': 'b9316cf4-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.684468051, 'message_signature': 'efea558fd7d6cc07be44fe58ad8d3ed8ad198ddf6505610f369982517ce46ff8'}]}, 'timestamp': '2025-11-22 10:12:16.989208', '_unique_id': '9e1c3476898c4dca850df4a216178333'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.989 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.990 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.990 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/memory.usage volume: 42.66015625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.990 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/memory.usage volume: 40.40625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f51a9fd-b1c5-4aff-b6e9-9d81eb61483b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.66015625, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'timestamp': '2025-11-22T10:12:16.990453', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b931a69c-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.735597323, 'message_signature': 'ec8040ae02e88f0e542a7bbaae3982e48ac9428eeca71aada7cc551ca8a20ad8'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.40625, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 
'timestamp': '2025-11-22T10:12:16.990453', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b931aea8-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.747501958, 'message_signature': '140a07a0e38a5d43a90c03b2f9c358850d7d603caaf624f6d70cccf1cd87468c'}]}, 'timestamp': '2025-11-22 10:12:16.990881', '_unique_id': '6c04ccccf2334b1a917e5fa56252dec2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.991 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.992 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.992 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.992 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.992 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31799159-3250-4d6b-9cf0-03dd0a4b9efc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-vda', 'timestamp': '2025-11-22T10:12:16.992015', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b931e3d2-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.697754763, 'message_signature': 'b93daa50b8f2c0364f2e4ed6c59f8c8203edc8bb1c8d053eef2c1aba51c09415'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 
'0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-sda', 'timestamp': '2025-11-22T10:12:16.992015', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b931eb70-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.697754763, 'message_signature': '68a2b3b835bdc770a760fac02386d604fdcb9ff452a7c19b25b0cac4d83d55d2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-vda', 'timestamp': '2025-11-22T10:12:16.992015', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b931f2c8-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.707772806, 'message_signature': '21d7f9fe33dacab45baf2fecdb20193e1bc04e35bad5e39aac13427bacb29759'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-sda', 'timestamp': '2025-11-22T10:12:16.992015', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b931fad4-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.707772806, 'message_signature': '43989385f124ad9023fccd5816dd340e85dd34fe56668b724c0d9a1490deaee5'}]}, 'timestamp': '2025-11-22 10:12:16.992828', '_unique_id': '7ebf382e86c540959cf3ed52d166f9b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.993 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.994 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.994 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.994 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8eb8b0c9-1618-48af-a072-b9499edd49f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-vda', 'timestamp': '2025-11-22T10:12:16.993949', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9322eb4-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.697754763, 'message_signature': '303aeacf64204d5e4ae3360ee0b0467078104b9b6d2ed2515152f81b1991daea'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 
'0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-sda', 'timestamp': '2025-11-22T10:12:16.993949', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'instance-0000000b', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9323850-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.697754763, 'message_signature': '74b6de552fb657ebb1ad683e7ffc24fb1bf625aab139756c5646cafd23bf3e12'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-vda', 'timestamp': '2025-11-22T10:12:16.993949', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9323fee-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.707772806, 'message_signature': '5bcd448a210f2ba244fd262b687ed30a59f4e375e1b63670a20f031e3cf23611'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1-sda', 'timestamp': '2025-11-22T10:12:16.993949', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'instance-0000000c', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b93247f0-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.707772806, 'message_signature': 'd0effa559fc210e8adfedf90973941592d90ea390141b32845d46c7f3b20c0e2'}]}, 'timestamp': '2025-11-22 10:12:16.994802', '_unique_id': 'c0fe3d679e4e440697affb24b2b35c18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.995 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10a3bdd3-06f3-4eba-a46c-68217cc8c822', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000b-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-tap8477606b-1e', 'timestamp': '2025-11-22T10:12:16.995960', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'tap8477606b-1e', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:04:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8477606b-1e'}, 'message_id': 'b9327d1a-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.681972803, 'message_signature': '9cea7cd5f5e521d1cbd2d5211f5b466023fe1d7a31144ea0b4fffd95e87ddc54'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000c-17c3ed36-93e9-413b-ad7e-15f77d2951f1-tapeb61cc86-c8', 'timestamp': '2025-11-22T10:12:16.995960', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'tapeb61cc86-c8', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:ad:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeb61cc86-c8'}, 'message_id': 'b93286c0-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.684468051, 'message_signature': '0ddcda9aac72d19c2479083b5254c3b1797dde2f8a60a44aeea06ec602d127e7'}]}, 'timestamp': '2025-11-22 10:12:16.996421', '_unique_id': '8b911a42a1fc4c65a693385a5955fa44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.996 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.997 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.997 12 DEBUG ceilometer.compute.pollsters [-] 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.997 12 DEBUG ceilometer.compute.pollsters [-] 17c3ed36-93e9-413b-ad7e-15f77d2951f1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '327536ef-5f33-4bd9-8ad2-70e81bfd6ab3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000b-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-tap8477606b-1e', 'timestamp': '2025-11-22T10:12:16.997526', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-432033049', 'name': 'tap8477606b-1e', 'instance_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:04:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8477606b-1e'}, 'message_id': 'b932bb22-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.681972803, 'message_signature': '8acaf8969aeadf3154c0a610d5ce7869bb01639ca57c5d8e614ecde550d311d7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'fd88a700663e44618f0a22f234573806', 'user_name': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_name': None, 'resource_id': 'instance-0000000c-17c3ed36-93e9-413b-ad7e-15f77d2951f1-tapeb61cc86-c8', 'timestamp': '2025-11-22T10:12:16.997526', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-867812263', 'name': 'tapeb61cc86-c8', 'instance_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'instance_type': 'm1.nano', 'host': '11a63b98d0f64f18b759de738863d6e648fcaf876dd8b9a2a35fe23d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1ae632e-4cf1-4552-835d-a183c94ebdfc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7f933537-dfd2-407d-a523-ec45187c75fc'}, 'image_ref': '7f933537-dfd2-407d-a523-ec45187c75fc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:ad:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeb61cc86-c8'}, 'message_id': 'b932c31a-c78b-11f0-abcb-fa163e0ac78a', 'monotonic_time': 3794.684468051, 'message_signature': 'fb26d37167091396350dad5693ee2b8ca1f49b8c2f4b8a141cbe9e5a0765dbf7'}]}, 'timestamp': '2025-11-22 10:12:16.997964', '_unique_id': '85a4a9ea1d444088873f2ec16610d75d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 10:12:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:12:16.998 12 ERROR oslo_messaging.notify.messaging 
Nov 22 10:12:17 compute-0 nova_compute[186981]: 2025-11-22 10:12:17.141 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:12:17 compute-0 nova_compute[186981]: 2025-11-22 10:12:17.141 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquired lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:12:17 compute-0 nova_compute[186981]: 2025-11-22 10:12:17.142 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 10:12:17 compute-0 nova_compute[186981]: 2025-11-22 10:12:17.142 186985 DEBUG nova.objects.instance [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:12:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:17.941 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:17.942 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:17.942 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.463 186985 DEBUG nova.network.neutron [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updating instance_info_cache with network_info: [{"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.490 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Releasing lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.490 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.490 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.491 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.618 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.619 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.619 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.619 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.692 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:12:18 compute-0 podman[218923]: 2025-11-22 10:12:18.72685115 +0000 UTC m=+0.061090705 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd)
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.750 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.751 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:12:18 compute-0 podman[218921]: 2025-11-22 10:12:18.76652564 +0000 UTC m=+0.093522577 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.806 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.812 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.863 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.864 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:12:18 compute-0 nova_compute[186981]: 2025-11-22 10:12:18.916 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.084 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.086 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5391MB free_disk=73.40082931518555GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.086 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.086 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.256 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Instance 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.256 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Instance 17c3ed36-93e9-413b-ad7e-15f77d2951f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.256 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.256 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.406 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.426 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.453 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.453 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:19 compute-0 nova_compute[186981]: 2025-11-22 10:12:19.822 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:20 compute-0 nova_compute[186981]: 2025-11-22 10:12:20.454 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:20 compute-0 nova_compute[186981]: 2025-11-22 10:12:20.455 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:20 compute-0 nova_compute[186981]: 2025-11-22 10:12:20.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:20 compute-0 nova_compute[186981]: 2025-11-22 10:12:20.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:12:21 compute-0 nova_compute[186981]: 2025-11-22 10:12:21.107 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:21 compute-0 nova_compute[186981]: 2025-11-22 10:12:21.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:22 compute-0 nova_compute[186981]: 2025-11-22 10:12:22.602 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:23 compute-0 nova_compute[186981]: 2025-11-22 10:12:23.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:24 compute-0 nova_compute[186981]: 2025-11-22 10:12:24.316 186985 INFO nova.compute.manager [None req-44ac61fd-08a9-444f-99e3-fbcbb87bed99 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Get console output
Nov 22 10:12:24 compute-0 nova_compute[186981]: 2025-11-22 10:12:24.320 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:12:24 compute-0 nova_compute[186981]: 2025-11-22 10:12:24.824 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:25 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:25.384 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:12:25 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:25.385 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.432 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.540 186985 DEBUG nova.compute.manager [req-5d711bbd-ecf8-42a2-abbf-937e99c37595 req-d3d479a5-a719-4e67-87b9-cf08ee05383a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-changed-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.541 186985 DEBUG nova.compute.manager [req-5d711bbd-ecf8-42a2-abbf-937e99c37595 req-d3d479a5-a719-4e67-87b9-cf08ee05383a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Refreshing instance network info cache due to event network-changed-8477606b-1e0e-478b-b3f5-5851cacc8594. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.541 186985 DEBUG oslo_concurrency.lockutils [req-5d711bbd-ecf8-42a2-abbf-937e99c37595 req-d3d479a5-a719-4e67-87b9-cf08ee05383a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.541 186985 DEBUG oslo_concurrency.lockutils [req-5d711bbd-ecf8-42a2-abbf-937e99c37595 req-d3d479a5-a719-4e67-87b9-cf08ee05383a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.542 186985 DEBUG nova.network.neutron [req-5d711bbd-ecf8-42a2-abbf-937e99c37595 req-d3d479a5-a719-4e67-87b9-cf08ee05383a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Refreshing network info cache for port 8477606b-1e0e-478b-b3f5-5851cacc8594 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.574 186985 DEBUG nova.compute.manager [req-cb8e589a-6014-4c8c-b8ae-1a0e604e3a06 req-22e4508e-0942-452c-a1e0-cccec59abc9e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-vif-unplugged-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.574 186985 DEBUG oslo_concurrency.lockutils [req-cb8e589a-6014-4c8c-b8ae-1a0e604e3a06 req-22e4508e-0942-452c-a1e0-cccec59abc9e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.574 186985 DEBUG oslo_concurrency.lockutils [req-cb8e589a-6014-4c8c-b8ae-1a0e604e3a06 req-22e4508e-0942-452c-a1e0-cccec59abc9e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.575 186985 DEBUG oslo_concurrency.lockutils [req-cb8e589a-6014-4c8c-b8ae-1a0e604e3a06 req-22e4508e-0942-452c-a1e0-cccec59abc9e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.575 186985 DEBUG nova.compute.manager [req-cb8e589a-6014-4c8c-b8ae-1a0e604e3a06 req-22e4508e-0942-452c-a1e0-cccec59abc9e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] No waiting events found dispatching network-vif-unplugged-8477606b-1e0e-478b-b3f5-5851cacc8594 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:12:25 compute-0 nova_compute[186981]: 2025-11-22 10:12:25.575 186985 WARNING nova.compute.manager [req-cb8e589a-6014-4c8c-b8ae-1a0e604e3a06 req-22e4508e-0942-452c-a1e0-cccec59abc9e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received unexpected event network-vif-unplugged-8477606b-1e0e-478b-b3f5-5851cacc8594 for instance with vm_state active and task_state None.
Nov 22 10:12:26 compute-0 nova_compute[186981]: 2025-11-22 10:12:26.110 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:26 compute-0 nova_compute[186981]: 2025-11-22 10:12:26.562 186985 INFO nova.compute.manager [None req-0caccf1f-1117-48af-9c58-f42da7b80730 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Get console output
Nov 22 10:12:26 compute-0 nova_compute[186981]: 2025-11-22 10:12:26.567 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:12:27 compute-0 nova_compute[186981]: 2025-11-22 10:12:27.508 186985 DEBUG nova.network.neutron [req-5d711bbd-ecf8-42a2-abbf-937e99c37595 req-d3d479a5-a719-4e67-87b9-cf08ee05383a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updated VIF entry in instance network info cache for port 8477606b-1e0e-478b-b3f5-5851cacc8594. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:12:27 compute-0 nova_compute[186981]: 2025-11-22 10:12:27.509 186985 DEBUG nova.network.neutron [req-5d711bbd-ecf8-42a2-abbf-937e99c37595 req-d3d479a5-a719-4e67-87b9-cf08ee05383a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updating instance_info_cache with network_info: [{"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:12:27 compute-0 nova_compute[186981]: 2025-11-22 10:12:27.527 186985 DEBUG oslo_concurrency.lockutils [req-5d711bbd-ecf8-42a2-abbf-937e99c37595 req-d3d479a5-a719-4e67-87b9-cf08ee05383a 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:12:27 compute-0 nova_compute[186981]: 2025-11-22 10:12:27.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:12:27 compute-0 nova_compute[186981]: 2025-11-22 10:12:27.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 10:12:27 compute-0 nova_compute[186981]: 2025-11-22 10:12:27.694 186985 DEBUG nova.compute.manager [req-08404e1d-1af5-4d65-9423-bea7d0a75e20 req-b72845c2-09ab-41b8-ae01-0c02c3bc5371 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:27 compute-0 nova_compute[186981]: 2025-11-22 10:12:27.694 186985 DEBUG oslo_concurrency.lockutils [req-08404e1d-1af5-4d65-9423-bea7d0a75e20 req-b72845c2-09ab-41b8-ae01-0c02c3bc5371 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:27 compute-0 nova_compute[186981]: 2025-11-22 10:12:27.694 186985 DEBUG oslo_concurrency.lockutils [req-08404e1d-1af5-4d65-9423-bea7d0a75e20 req-b72845c2-09ab-41b8-ae01-0c02c3bc5371 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:27 compute-0 nova_compute[186981]: 2025-11-22 10:12:27.695 186985 DEBUG oslo_concurrency.lockutils [req-08404e1d-1af5-4d65-9423-bea7d0a75e20 req-b72845c2-09ab-41b8-ae01-0c02c3bc5371 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:27 compute-0 nova_compute[186981]: 2025-11-22 10:12:27.695 186985 DEBUG nova.compute.manager [req-08404e1d-1af5-4d65-9423-bea7d0a75e20 req-b72845c2-09ab-41b8-ae01-0c02c3bc5371 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] No waiting events found dispatching network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:12:27 compute-0 nova_compute[186981]: 2025-11-22 10:12:27.695 186985 WARNING nova.compute.manager [req-08404e1d-1af5-4d65-9423-bea7d0a75e20 req-b72845c2-09ab-41b8-ae01-0c02c3bc5371 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received unexpected event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 for instance with vm_state active and task_state None.
Nov 22 10:12:28 compute-0 nova_compute[186981]: 2025-11-22 10:12:28.606 186985 INFO nova.compute.manager [None req-4de24985-5464-4da8-be3d-36049dcced9b fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Get console output
Nov 22 10:12:28 compute-0 nova_compute[186981]: 2025-11-22 10:12:28.610 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.813 186985 DEBUG nova.compute.manager [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-changed-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.814 186985 DEBUG nova.compute.manager [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Refreshing instance network info cache due to event network-changed-8477606b-1e0e-478b-b3f5-5851cacc8594. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.814 186985 DEBUG oslo_concurrency.lockutils [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.814 186985 DEBUG oslo_concurrency.lockutils [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.814 186985 DEBUG nova.network.neutron [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Refreshing network info cache for port 8477606b-1e0e-478b-b3f5-5851cacc8594 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.827 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.851 186985 DEBUG nova.compute.manager [req-bdbd2284-f6e8-437e-a207-565bcbe7718f req-eb8314e9-18dc-4afd-bc0a-a4b7d02d0a14 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Received event network-changed-eb61cc86-c8e0-4eda-a84a-3d65295b0944 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.852 186985 DEBUG nova.compute.manager [req-bdbd2284-f6e8-437e-a207-565bcbe7718f req-eb8314e9-18dc-4afd-bc0a-a4b7d02d0a14 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Refreshing instance network info cache due to event network-changed-eb61cc86-c8e0-4eda-a84a-3d65295b0944. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.852 186985 DEBUG oslo_concurrency.lockutils [req-bdbd2284-f6e8-437e-a207-565bcbe7718f req-eb8314e9-18dc-4afd-bc0a-a4b7d02d0a14 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.853 186985 DEBUG oslo_concurrency.lockutils [req-bdbd2284-f6e8-437e-a207-565bcbe7718f req-eb8314e9-18dc-4afd-bc0a-a4b7d02d0a14 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.853 186985 DEBUG nova.network.neutron [req-bdbd2284-f6e8-437e-a207-565bcbe7718f req-eb8314e9-18dc-4afd-bc0a-a4b7d02d0a14 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Refreshing network info cache for port eb61cc86-c8e0-4eda-a84a-3d65295b0944 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.986 186985 DEBUG oslo_concurrency.lockutils [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.986 186985 DEBUG oslo_concurrency.lockutils [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.987 186985 DEBUG oslo_concurrency.lockutils [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.987 186985 DEBUG oslo_concurrency.lockutils [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.988 186985 DEBUG oslo_concurrency.lockutils [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.989 186985 INFO nova.compute.manager [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Terminating instance
Nov 22 10:12:29 compute-0 nova_compute[186981]: 2025-11-22 10:12:29.990 186985 DEBUG nova.compute.manager [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:12:30 compute-0 kernel: tapeb61cc86-c8 (unregistering): left promiscuous mode
Nov 22 10:12:30 compute-0 NetworkManager[55425]: <info>  [1763806350.0148] device (tapeb61cc86-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.023 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 ovn_controller[95329]: 2025-11-22T10:12:30Z|00155|binding|INFO|Releasing lport eb61cc86-c8e0-4eda-a84a-3d65295b0944 from this chassis (sb_readonly=0)
Nov 22 10:12:30 compute-0 ovn_controller[95329]: 2025-11-22T10:12:30Z|00156|binding|INFO|Setting lport eb61cc86-c8e0-4eda-a84a-3d65295b0944 down in Southbound
Nov 22 10:12:30 compute-0 ovn_controller[95329]: 2025-11-22T10:12:30Z|00157|binding|INFO|Removing iface tapeb61cc86-c8 ovn-installed in OVS
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.026 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.033 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:ad:c3 10.100.0.5'], port_security=['fa:16:3e:a4:ad:c3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3bcc47b5-14ed-4281-bc3d-05f871760286', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c7fdb23-f90b-44b5-b277-8b2cd6211afa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4de6700b-c3c2-42de-95b4-e4178e78410b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=eb61cc86-c8e0-4eda-a84a-3d65295b0944) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.035 104216 INFO neutron.agent.ovn.metadata.agent [-] Port eb61cc86-c8e0-4eda-a84a-3d65295b0944 in datapath 3bcc47b5-14ed-4281-bc3d-05f871760286 unbound from our chassis
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.037 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3bcc47b5-14ed-4281-bc3d-05f871760286
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.047 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.055 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[23d16ec7-2695-4de9-a2f8-34b94b63ed24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Nov 22 10:12:30 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 13.189s CPU time.
Nov 22 10:12:30 compute-0 systemd-machined[153303]: Machine qemu-12-instance-0000000c terminated.
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.086 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[d292a880-54ff-4262-b8c2-6216d5a955d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.089 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[0c8d0a66-6cbc-4300-bac3-3105c003be0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 podman[218981]: 2025-11-22 10:12:30.110240484 +0000 UTC m=+0.061154901 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.116 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b99edf-3a7a-4296-90bd-c677e6a8fc59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.134 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[7ad64ecd-a8fc-4553-8175-e5c4978a936e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3bcc47b5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:13:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375828, 'reachable_time': 44793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219015, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.150 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[78e10b11-08ca-4479-9e16-bd13ef089215]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3bcc47b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375839, 'tstamp': 375839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219016, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3bcc47b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375842, 'tstamp': 375842}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219016, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.151 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bcc47b5-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.153 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.157 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.158 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bcc47b5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.158 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.158 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3bcc47b5-10, col_values=(('external_ids', {'iface-id': 'f00269a4-e7d1-47d4-b0a8-3ef04c233d4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.159 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:12:30 compute-0 kernel: tapeb61cc86-c8: entered promiscuous mode
Nov 22 10:12:30 compute-0 NetworkManager[55425]: <info>  [1763806350.2067] manager: (tapeb61cc86-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Nov 22 10:12:30 compute-0 kernel: tapeb61cc86-c8 (unregistering): left promiscuous mode
Nov 22 10:12:30 compute-0 ovn_controller[95329]: 2025-11-22T10:12:30Z|00158|binding|INFO|Claiming lport eb61cc86-c8e0-4eda-a84a-3d65295b0944 for this chassis.
Nov 22 10:12:30 compute-0 ovn_controller[95329]: 2025-11-22T10:12:30Z|00159|binding|INFO|eb61cc86-c8e0-4eda-a84a-3d65295b0944: Claiming fa:16:3e:a4:ad:c3 10.100.0.5
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.212 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.220 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:ad:c3 10.100.0.5'], port_security=['fa:16:3e:a4:ad:c3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3bcc47b5-14ed-4281-bc3d-05f871760286', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c7fdb23-f90b-44b5-b277-8b2cd6211afa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4de6700b-c3c2-42de-95b4-e4178e78410b, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=eb61cc86-c8e0-4eda-a84a-3d65295b0944) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.221 104216 INFO neutron.agent.ovn.metadata.agent [-] Port eb61cc86-c8e0-4eda-a84a-3d65295b0944 in datapath 3bcc47b5-14ed-4281-bc3d-05f871760286 bound to our chassis
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.222 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3bcc47b5-14ed-4281-bc3d-05f871760286
Nov 22 10:12:30 compute-0 ovn_controller[95329]: 2025-11-22T10:12:30Z|00160|binding|INFO|Releasing lport eb61cc86-c8e0-4eda-a84a-3d65295b0944 from this chassis (sb_readonly=0)
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.225 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.233 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:ad:c3 10.100.0.5'], port_security=['fa:16:3e:a4:ad:c3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '17c3ed36-93e9-413b-ad7e-15f77d2951f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3bcc47b5-14ed-4281-bc3d-05f871760286', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c7fdb23-f90b-44b5-b277-8b2cd6211afa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4de6700b-c3c2-42de-95b4-e4178e78410b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=eb61cc86-c8e0-4eda-a84a-3d65295b0944) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.244 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[6e7c2259-95f0-4e14-8f34-25c882e7b063]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.275 186985 INFO nova.virt.libvirt.driver [-] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Instance destroyed successfully.
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.276 186985 DEBUG nova.objects.instance [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid 17c3ed36-93e9-413b-ad7e-15f77d2951f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.276 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[b2faa7eb-9765-442d-9fcc-33616d1a0c7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.279 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[f6947d68-a70f-413f-8bcc-dc2bd433de6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.297 186985 DEBUG nova.virt.libvirt.vif [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:11:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-867812263',display_name='tempest-TestNetworkBasicOps-server-867812263',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-867812263',id=12,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNiGzHVfNRblYz4Oj3U14Lxniim9sMmEJRIjLtwT8EFdvSQjQNjZXAeN/EbWFjndF8YRi51URdR2HBrMGZTwNn7ahCwB6paEmxUEalYL9hIE0q5QlinSiU0G2FyPmsPVkQ==',key_name='tempest-TestNetworkBasicOps-1075170225',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:12:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-d4kmj3e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:12:03Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=17c3ed36-93e9-413b-ad7e-15f77d2951f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.298 186985 DEBUG nova.network.os_vif_util [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.299 186985 DEBUG nova.network.os_vif_util [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a4:ad:c3,bridge_name='br-int',has_traffic_filtering=True,id=eb61cc86-c8e0-4eda-a84a-3d65295b0944,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb61cc86-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.299 186985 DEBUG os_vif [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a4:ad:c3,bridge_name='br-int',has_traffic_filtering=True,id=eb61cc86-c8e0-4eda-a84a-3d65295b0944,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb61cc86-c8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.301 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.302 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb61cc86-c8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.303 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.309 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.309 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[62d64fd4-556a-4ff4-aa52-81669f5cdd5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.313 186985 INFO os_vif [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a4:ad:c3,bridge_name='br-int',has_traffic_filtering=True,id=eb61cc86-c8e0-4eda-a84a-3d65295b0944,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb61cc86-c8')
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.314 186985 INFO nova.virt.libvirt.driver [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Deleting instance files /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1_del
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.315 186985 INFO nova.virt.libvirt.driver [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Deletion of /var/lib/nova/instances/17c3ed36-93e9-413b-ad7e-15f77d2951f1_del complete
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.336 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[2377f7cd-9494-4753-8895-b7483920b001]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3bcc47b5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:13:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375828, 'reachable_time': 44793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219037, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.352 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[8df5cf83-27fa-45be-b43a-ba5b97a664b4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3bcc47b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375839, 'tstamp': 375839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219038, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3bcc47b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375842, 'tstamp': 375842}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219038, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.354 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bcc47b5-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.355 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.356 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.357 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bcc47b5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.357 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.357 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3bcc47b5-10, col_values=(('external_ids', {'iface-id': 'f00269a4-e7d1-47d4-b0a8-3ef04c233d4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.357 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.358 104216 INFO neutron.agent.ovn.metadata.agent [-] Port eb61cc86-c8e0-4eda-a84a-3d65295b0944 in datapath 3bcc47b5-14ed-4281-bc3d-05f871760286 unbound from our chassis
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.359 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3bcc47b5-14ed-4281-bc3d-05f871760286
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.373 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[1e818cbd-cf73-4303-bb34-7b7c42d89959]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.385 186985 INFO nova.compute.manager [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.385 186985 DEBUG oslo.service.loopingcall [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.386 186985 DEBUG nova.compute.manager [-] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.386 186985 DEBUG nova.network.neutron [-] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.408 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c3b913-5345-4b1a-ad94-9505dadec690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.410 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe99550-4047-46fb-8916-6299ea8da8df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.448 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[ed6f3337-014e-44aa-aa00-15979e02d61f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.470 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[5cb8629f-4640-4b89-8c93-df60b05e15c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3bcc47b5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:13:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375828, 'reachable_time': 44793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219044, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.489 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[43858a3d-3437-4956-86a6-5dc842e50f5b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3bcc47b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375839, 'tstamp': 375839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219045, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3bcc47b5-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375842, 'tstamp': 375842}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219045, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.491 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bcc47b5-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.492 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.495 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.496 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bcc47b5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.496 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.497 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3bcc47b5-10, col_values=(('external_ids', {'iface-id': 'f00269a4-e7d1-47d4-b0a8-3ef04c233d4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:30 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:30.498 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.889 186985 DEBUG nova.network.neutron [req-bdbd2284-f6e8-437e-a207-565bcbe7718f req-eb8314e9-18dc-4afd-bc0a-a4b7d02d0a14 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Updated VIF entry in instance network info cache for port eb61cc86-c8e0-4eda-a84a-3d65295b0944. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.890 186985 DEBUG nova.network.neutron [req-bdbd2284-f6e8-437e-a207-565bcbe7718f req-eb8314e9-18dc-4afd-bc0a-a4b7d02d0a14 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Updating instance_info_cache with network_info: [{"id": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "address": "fa:16:3e:a4:ad:c3", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb61cc86-c8", "ovs_interfaceid": "eb61cc86-c8e0-4eda-a84a-3d65295b0944", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.913 186985 DEBUG oslo_concurrency.lockutils [req-bdbd2284-f6e8-437e-a207-565bcbe7718f req-eb8314e9-18dc-4afd-bc0a-a4b7d02d0a14 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-17c3ed36-93e9-413b-ad7e-15f77d2951f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.922 186985 DEBUG nova.network.neutron [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updated VIF entry in instance network info cache for port 8477606b-1e0e-478b-b3f5-5851cacc8594. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.923 186985 DEBUG nova.network.neutron [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updating instance_info_cache with network_info: [{"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.939 186985 DEBUG oslo_concurrency.lockutils [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.939 186985 DEBUG nova.compute.manager [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.940 186985 DEBUG oslo_concurrency.lockutils [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.940 186985 DEBUG oslo_concurrency.lockutils [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.940 186985 DEBUG oslo_concurrency.lockutils [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.941 186985 DEBUG nova.compute.manager [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] No waiting events found dispatching network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.941 186985 WARNING nova.compute.manager [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received unexpected event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 for instance with vm_state active and task_state None.
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.941 186985 DEBUG nova.compute.manager [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.941 186985 DEBUG oslo_concurrency.lockutils [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.942 186985 DEBUG oslo_concurrency.lockutils [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.942 186985 DEBUG oslo_concurrency.lockutils [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.942 186985 DEBUG nova.compute.manager [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] No waiting events found dispatching network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:12:30 compute-0 nova_compute[186981]: 2025-11-22 10:12:30.942 186985 WARNING nova.compute.manager [req-6a7afc26-2a97-48a1-ba07-fe7c8aa0d65b req-3ce6deb4-692d-4467-b28f-62f30630672e 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received unexpected event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 for instance with vm_state active and task_state None.
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.113 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.174 186985 DEBUG nova.network.neutron [-] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.195 186985 INFO nova.compute.manager [-] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Took 0.81 seconds to deallocate network for instance.
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.255 186985 DEBUG oslo_concurrency.lockutils [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.255 186985 DEBUG oslo_concurrency.lockutils [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.330 186985 DEBUG nova.compute.provider_tree [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.348 186985 DEBUG nova.scheduler.client.report [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.367 186985 DEBUG oslo_concurrency.lockutils [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.397 186985 INFO nova.scheduler.client.report [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance 17c3ed36-93e9-413b-ad7e-15f77d2951f1
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.480 186985 DEBUG oslo_concurrency.lockutils [None req-c7515c2d-6df9-4e3e-aa68-1f4b0a1c4a3a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.975 186985 DEBUG nova.compute.manager [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Received event network-vif-unplugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.975 186985 DEBUG oslo_concurrency.lockutils [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.976 186985 DEBUG oslo_concurrency.lockutils [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.976 186985 DEBUG oslo_concurrency.lockutils [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.977 186985 DEBUG nova.compute.manager [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] No waiting events found dispatching network-vif-unplugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.978 186985 WARNING nova.compute.manager [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Received unexpected event network-vif-unplugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 for instance with vm_state deleted and task_state None.
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.978 186985 DEBUG nova.compute.manager [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Received event network-vif-plugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.978 186985 DEBUG oslo_concurrency.lockutils [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.979 186985 DEBUG oslo_concurrency.lockutils [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.979 186985 DEBUG oslo_concurrency.lockutils [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "17c3ed36-93e9-413b-ad7e-15f77d2951f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.980 186985 DEBUG nova.compute.manager [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] No waiting events found dispatching network-vif-plugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:12:31 compute-0 nova_compute[186981]: 2025-11-22 10:12:31.980 186985 WARNING nova.compute.manager [req-059a7e8b-9bc1-4107-bc73-1480496c9b74 req-4c68e9d3-24a6-4049-b1ba-dd92c0a8cb76 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Received unexpected event network-vif-plugged-eb61cc86-c8e0-4eda-a84a-3d65295b0944 for instance with vm_state deleted and task_state None.
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.061 186985 DEBUG nova.compute.manager [req-c3e02a37-0439-4a13-9db0-7febde35e441 req-2109cf4f-da8d-49c8-a1c2-60d129ad98a7 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Received event network-vif-deleted-eb61cc86-c8e0-4eda-a84a-3d65295b0944 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.729 186985 DEBUG oslo_concurrency.lockutils [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.729 186985 DEBUG oslo_concurrency.lockutils [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.730 186985 DEBUG oslo_concurrency.lockutils [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.730 186985 DEBUG oslo_concurrency.lockutils [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.730 186985 DEBUG oslo_concurrency.lockutils [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.731 186985 INFO nova.compute.manager [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Terminating instance
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.732 186985 DEBUG nova.compute.manager [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:12:32 compute-0 kernel: tap8477606b-1e (unregistering): left promiscuous mode
Nov 22 10:12:32 compute-0 NetworkManager[55425]: <info>  [1763806352.7619] device (tap8477606b-1e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:12:32 compute-0 ovn_controller[95329]: 2025-11-22T10:12:32Z|00161|binding|INFO|Releasing lport 8477606b-1e0e-478b-b3f5-5851cacc8594 from this chassis (sb_readonly=0)
Nov 22 10:12:32 compute-0 ovn_controller[95329]: 2025-11-22T10:12:32Z|00162|binding|INFO|Setting lport 8477606b-1e0e-478b-b3f5-5851cacc8594 down in Southbound
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.767 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:32 compute-0 ovn_controller[95329]: 2025-11-22T10:12:32Z|00163|binding|INFO|Removing iface tap8477606b-1e ovn-installed in OVS
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.770 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:32.775 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:04:3e 10.100.0.13'], port_security=['fa:16:3e:e9:04:3e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0cc84ed6-e43a-4e94-8e2e-5a057bbfee73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3bcc47b5-14ed-4281-bc3d-05f871760286', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e465d698-ac20-4ffb-95b2-d7abfb45d591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4de6700b-c3c2-42de-95b4-e4178e78410b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=8477606b-1e0e-478b-b3f5-5851cacc8594) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:12:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:32.776 104216 INFO neutron.agent.ovn.metadata.agent [-] Port 8477606b-1e0e-478b-b3f5-5851cacc8594 in datapath 3bcc47b5-14ed-4281-bc3d-05f871760286 unbound from our chassis
Nov 22 10:12:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:32.777 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3bcc47b5-14ed-4281-bc3d-05f871760286, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:12:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:32.778 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[d82eb730-8416-469e-872a-14928f0918fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:32.778 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286 namespace which is not needed anymore
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.784 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:32 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Nov 22 10:12:32 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 14.066s CPU time.
Nov 22 10:12:32 compute-0 systemd-machined[153303]: Machine qemu-11-instance-0000000b terminated.
Nov 22 10:12:32 compute-0 neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286[218633]: [NOTICE]   (218637) : haproxy version is 2.8.14-c23fe91
Nov 22 10:12:32 compute-0 neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286[218633]: [NOTICE]   (218637) : path to executable is /usr/sbin/haproxy
Nov 22 10:12:32 compute-0 neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286[218633]: [WARNING]  (218637) : Exiting Master process...
Nov 22 10:12:32 compute-0 neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286[218633]: [ALERT]    (218637) : Current worker (218639) exited with code 143 (Terminated)
Nov 22 10:12:32 compute-0 neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286[218633]: [WARNING]  (218637) : All workers exited. Exiting... (0)
Nov 22 10:12:32 compute-0 systemd[1]: libpod-5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8.scope: Deactivated successfully.
Nov 22 10:12:32 compute-0 podman[219069]: 2025-11-22 10:12:32.925383569 +0000 UTC m=+0.049228358 container died 5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 10:12:32 compute-0 NetworkManager[55425]: <info>  [1763806352.9484] manager: (tap8477606b-1e): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Nov 22 10:12:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8-userdata-shm.mount: Deactivated successfully.
Nov 22 10:12:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-cce43ac059813df9444c6c775500ca10227a2c8f1ae0f7d49b639fd1ba19966f-merged.mount: Deactivated successfully.
Nov 22 10:12:32 compute-0 podman[219069]: 2025-11-22 10:12:32.964493921 +0000 UTC m=+0.088338730 container cleanup 5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:12:32 compute-0 systemd[1]: libpod-conmon-5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8.scope: Deactivated successfully.
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.986 186985 INFO nova.virt.libvirt.driver [-] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Instance destroyed successfully.
Nov 22 10:12:32 compute-0 nova_compute[186981]: 2025-11-22 10:12:32.986 186985 DEBUG nova.objects.instance [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.000 186985 DEBUG nova.virt.libvirt.vif [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:11:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-432033049',display_name='tempest-TestNetworkBasicOps-server-432033049',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-432033049',id=11,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFMjT3SILCckN7hccmQPdQJZ/KxZaeTzO5FvYoEKS1evzYdiPtDC27AgzmpjzTkQ0fm10422f6oVjdCb6vftsFGdHE/l6y7M018xvotYzDwfn0yofl/oqZm0j4BRjxoNXw==',key_name='tempest-TestNetworkBasicOps-397061661',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:11:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-nrdivubi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:11:42Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=0cc84ed6-e43a-4e94-8e2e-5a057bbfee73,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.000 186985 DEBUG nova.network.os_vif_util [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.001 186985 DEBUG nova.network.os_vif_util [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:04:3e,bridge_name='br-int',has_traffic_filtering=True,id=8477606b-1e0e-478b-b3f5-5851cacc8594,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8477606b-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.001 186985 DEBUG os_vif [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:04:3e,bridge_name='br-int',has_traffic_filtering=True,id=8477606b-1e0e-478b-b3f5-5851cacc8594,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8477606b-1e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.002 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.002 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8477606b-1e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.003 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.005 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.007 186985 INFO os_vif [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:04:3e,bridge_name='br-int',has_traffic_filtering=True,id=8477606b-1e0e-478b-b3f5-5851cacc8594,network=Network(3bcc47b5-14ed-4281-bc3d-05f871760286),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8477606b-1e')
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.007 186985 INFO nova.virt.libvirt.driver [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Deleting instance files /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73_del
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.008 186985 INFO nova.virt.libvirt.driver [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Deletion of /var/lib/nova/instances/0cc84ed6-e43a-4e94-8e2e-5a057bbfee73_del complete
Nov 22 10:12:33 compute-0 podman[219112]: 2025-11-22 10:12:33.025412296 +0000 UTC m=+0.038426305 container remove 5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 10:12:33 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:33.030 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc24ac7-a0c5-46e6-bf93-fb2fec196fb4]: (4, ('Sat Nov 22 10:12:32 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286 (5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8)\n5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8\nSat Nov 22 10:12:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286 (5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8)\n5901aa192b924a55fa00d5401509d44a4c1e877b61052e2fed0ee318506957b8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:33 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:33.032 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[2c6fac4a-b3f4-41c3-95e9-1138dd956776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:33 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:33.032 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bcc47b5-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.034 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:33 compute-0 kernel: tap3bcc47b5-10: left promiscuous mode
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.045 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.046 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:33 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:33.049 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[3d68054d-e1a4-4032-b852-3880b90a301a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.055 186985 INFO nova.compute.manager [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Took 0.32 seconds to destroy the instance on the hypervisor.
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.055 186985 DEBUG oslo.service.loopingcall [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.056 186985 DEBUG nova.compute.manager [-] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:12:33 compute-0 nova_compute[186981]: 2025-11-22 10:12:33.056 186985 DEBUG nova.network.neutron [-] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:12:33 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:33.068 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9e91ef-83ce-4391-b2de-f568a5ecdc27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:33 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:33.069 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[84d30302-5b19-4119-91d4-7244e7c2701a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:33 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:33.083 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[e0aaca80-bc25-480a-af70-af0304b0af84]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375821, 'reachable_time': 35262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219124, 'error': None, 'target': 'ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:33 compute-0 systemd[1]: run-netns-ovnmeta\x2d3bcc47b5\x2d14ed\x2d4281\x2dbc3d\x2d05f871760286.mount: Deactivated successfully.
Nov 22 10:12:33 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:33.086 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3bcc47b5-14ed-4281-bc3d-05f871760286 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:12:33 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:33.086 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[8f073551-3b76-41d6-b001-a62f37ecf113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.085 186985 DEBUG nova.compute.manager [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-vif-unplugged-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.086 186985 DEBUG oslo_concurrency.lockutils [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.086 186985 DEBUG oslo_concurrency.lockutils [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.087 186985 DEBUG oslo_concurrency.lockutils [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.087 186985 DEBUG nova.compute.manager [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] No waiting events found dispatching network-vif-unplugged-8477606b-1e0e-478b-b3f5-5851cacc8594 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.088 186985 DEBUG nova.compute.manager [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-vif-unplugged-8477606b-1e0e-478b-b3f5-5851cacc8594 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.088 186985 DEBUG nova.compute.manager [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.088 186985 DEBUG oslo_concurrency.lockutils [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.089 186985 DEBUG oslo_concurrency.lockutils [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.089 186985 DEBUG oslo_concurrency.lockutils [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.090 186985 DEBUG nova.compute.manager [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] No waiting events found dispatching network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.090 186985 WARNING nova.compute.manager [req-a04de021-0dbe-42bb-9d6c-b38dbe74fc53 req-5b89918d-f8b0-4416-a367-c67a333835bd 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received unexpected event network-vif-plugged-8477606b-1e0e-478b-b3f5-5851cacc8594 for instance with vm_state active and task_state deleting.
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.187 186985 DEBUG nova.compute.manager [req-944db2c8-31dc-41b7-982d-40cd162a8c1e req-00b59dba-d164-44aa-a7a1-0dd12ed2b229 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-changed-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.187 186985 DEBUG nova.compute.manager [req-944db2c8-31dc-41b7-982d-40cd162a8c1e req-00b59dba-d164-44aa-a7a1-0dd12ed2b229 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Refreshing instance network info cache due to event network-changed-8477606b-1e0e-478b-b3f5-5851cacc8594. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.187 186985 DEBUG oslo_concurrency.lockutils [req-944db2c8-31dc-41b7-982d-40cd162a8c1e req-00b59dba-d164-44aa-a7a1-0dd12ed2b229 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.187 186985 DEBUG oslo_concurrency.lockutils [req-944db2c8-31dc-41b7-982d-40cd162a8c1e req-00b59dba-d164-44aa-a7a1-0dd12ed2b229 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:12:34 compute-0 nova_compute[186981]: 2025-11-22 10:12:34.188 186985 DEBUG nova.network.neutron [req-944db2c8-31dc-41b7-982d-40cd162a8c1e req-00b59dba-d164-44aa-a7a1-0dd12ed2b229 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Refreshing network info cache for port 8477606b-1e0e-478b-b3f5-5851cacc8594 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:12:34 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:12:34.388 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.140 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.359 186985 DEBUG nova.network.neutron [-] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.376 186985 INFO nova.compute.manager [-] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Took 3.32 seconds to deallocate network for instance.
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.421 186985 DEBUG nova.compute.manager [req-7a64a1d0-5178-492d-8bc9-18ada0ee1581 req-8a4c029e-3623-432e-aced-bf85dbf2599b 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Received event network-vif-deleted-8477606b-1e0e-478b-b3f5-5851cacc8594 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.425 186985 DEBUG oslo_concurrency.lockutils [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.426 186985 DEBUG oslo_concurrency.lockutils [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.483 186985 DEBUG nova.compute.provider_tree [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.499 186985 DEBUG nova.scheduler.client.report [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.524 186985 DEBUG oslo_concurrency.lockutils [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.548 186985 INFO nova.scheduler.client.report [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.611 186985 DEBUG oslo_concurrency.lockutils [None req-efb30212-073b-4a68-b0bf-1722ce8b00ab fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.882s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.842 186985 DEBUG nova.network.neutron [req-944db2c8-31dc-41b7-982d-40cd162a8c1e req-00b59dba-d164-44aa-a7a1-0dd12ed2b229 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updated VIF entry in instance network info cache for port 8477606b-1e0e-478b-b3f5-5851cacc8594. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.843 186985 DEBUG nova.network.neutron [req-944db2c8-31dc-41b7-982d-40cd162a8c1e req-00b59dba-d164-44aa-a7a1-0dd12ed2b229 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Updating instance_info_cache with network_info: [{"id": "8477606b-1e0e-478b-b3f5-5851cacc8594", "address": "fa:16:3e:e9:04:3e", "network": {"id": "3bcc47b5-14ed-4281-bc3d-05f871760286", "bridge": "br-int", "label": "tempest-network-smoke--596176681", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8477606b-1e", "ovs_interfaceid": "8477606b-1e0e-478b-b3f5-5851cacc8594", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:12:36 compute-0 nova_compute[186981]: 2025-11-22 10:12:36.863 186985 DEBUG oslo_concurrency.lockutils [req-944db2c8-31dc-41b7-982d-40cd162a8c1e req-00b59dba-d164-44aa-a7a1-0dd12ed2b229 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-0cc84ed6-e43a-4e94-8e2e-5a057bbfee73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:12:38 compute-0 nova_compute[186981]: 2025-11-22 10:12:38.003 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:40 compute-0 podman[219129]: 2025-11-22 10:12:40.62762665 +0000 UTC m=+0.074988188 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 10:12:40 compute-0 podman[219130]: 2025-11-22 10:12:40.63315637 +0000 UTC m=+0.088499795 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:12:40 compute-0 nova_compute[186981]: 2025-11-22 10:12:40.691 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:40 compute-0 nova_compute[186981]: 2025-11-22 10:12:40.770 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:41 compute-0 nova_compute[186981]: 2025-11-22 10:12:41.142 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:43 compute-0 nova_compute[186981]: 2025-11-22 10:12:43.070 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:45 compute-0 nova_compute[186981]: 2025-11-22 10:12:45.273 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763806350.2720122, 17c3ed36-93e9-413b-ad7e-15f77d2951f1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:12:45 compute-0 nova_compute[186981]: 2025-11-22 10:12:45.274 186985 INFO nova.compute.manager [-] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] VM Stopped (Lifecycle Event)
Nov 22 10:12:45 compute-0 nova_compute[186981]: 2025-11-22 10:12:45.299 186985 DEBUG nova.compute.manager [None req-dc190d35-ce15-41e4-8a41-354af4842b66 - - - - - -] [instance: 17c3ed36-93e9-413b-ad7e-15f77d2951f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:12:46 compute-0 nova_compute[186981]: 2025-11-22 10:12:46.145 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:47 compute-0 podman[219177]: 2025-11-22 10:12:47.630065132 +0000 UTC m=+0.069095937 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 10:12:47 compute-0 podman[219176]: 2025-11-22 10:12:47.654159777 +0000 UTC m=+0.098429914 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 10:12:47 compute-0 nova_compute[186981]: 2025-11-22 10:12:47.984 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763806352.9832244, 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:12:47 compute-0 nova_compute[186981]: 2025-11-22 10:12:47.984 186985 INFO nova.compute.manager [-] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] VM Stopped (Lifecycle Event)
Nov 22 10:12:48 compute-0 nova_compute[186981]: 2025-11-22 10:12:48.051 186985 DEBUG nova.compute.manager [None req-62d1cd60-817a-437d-94cf-7d534c4e1e76 - - - - - -] [instance: 0cc84ed6-e43a-4e94-8e2e-5a057bbfee73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:12:48 compute-0 nova_compute[186981]: 2025-11-22 10:12:48.072 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:49 compute-0 podman[219215]: 2025-11-22 10:12:49.629433922 +0000 UTC m=+0.075100670 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:12:49 compute-0 podman[219216]: 2025-11-22 10:12:49.653800124 +0000 UTC m=+0.093104969 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 10:12:51 compute-0 nova_compute[186981]: 2025-11-22 10:12:51.148 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:53 compute-0 nova_compute[186981]: 2025-11-22 10:12:53.124 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.496 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.496 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.521 186985 DEBUG nova.compute.manager [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.605 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.605 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.613 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.613 186985 INFO nova.compute.claims [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Claim successful on node compute-0.ctlplane.example.com
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.713 186985 DEBUG nova.compute.provider_tree [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.731 186985 DEBUG nova.scheduler.client.report [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.757 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.757 186985 DEBUG nova.compute.manager [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.804 186985 DEBUG nova.compute.manager [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.804 186985 DEBUG nova.network.neutron [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.827 186985 INFO nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.845 186985 DEBUG nova.compute.manager [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.932 186985 DEBUG nova.compute.manager [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.933 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.934 186985 INFO nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Creating image(s)
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.934 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "/var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.935 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.935 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "/var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:55 compute-0 nova_compute[186981]: 2025-11-22 10:12:55.948 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.012 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.013 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "165ece4b017b704455dfc2c97897af8403d1c3eb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.014 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.025 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.076 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.077 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.112 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb,backing_fmt=raw /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.113 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "165ece4b017b704455dfc2c97897af8403d1c3eb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.113 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.166 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/165ece4b017b704455dfc2c97897af8403d1c3eb --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.167 186985 DEBUG nova.virt.disk.api [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Checking if we can resize image /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.167 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.180 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.184 186985 DEBUG nova.policy [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd88a700663e44618f0a22f234573806', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.219 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.220 186985 DEBUG nova.virt.disk.api [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Cannot resize image /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.220 186985 DEBUG nova.objects.instance [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'migration_context' on Instance uuid ea3ef3d3-b413-4626-b3a0-e09ab809e661 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.233 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.233 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Ensure instance console log exists: /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.234 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.234 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.234 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:56 compute-0 nova_compute[186981]: 2025-11-22 10:12:56.930 186985 DEBUG nova.network.neutron [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Successfully created port: c6c9c6b8-8279-46c9-839c-b2f5011e57b6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 10:12:57 compute-0 nova_compute[186981]: 2025-11-22 10:12:57.779 186985 DEBUG nova.network.neutron [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Successfully updated port: c6c9c6b8-8279-46c9-839c-b2f5011e57b6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 10:12:57 compute-0 nova_compute[186981]: 2025-11-22 10:12:57.796 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:12:57 compute-0 nova_compute[186981]: 2025-11-22 10:12:57.797 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquired lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:12:57 compute-0 nova_compute[186981]: 2025-11-22 10:12:57.798 186985 DEBUG nova.network.neutron [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 10:12:57 compute-0 nova_compute[186981]: 2025-11-22 10:12:57.882 186985 DEBUG nova.compute.manager [req-a2bfd5f6-b9a1-4838-9288-fa1edce14101 req-e674303f-a9f2-4b74-a60f-74cdd9d45935 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Received event network-changed-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:12:57 compute-0 nova_compute[186981]: 2025-11-22 10:12:57.882 186985 DEBUG nova.compute.manager [req-a2bfd5f6-b9a1-4838-9288-fa1edce14101 req-e674303f-a9f2-4b74-a60f-74cdd9d45935 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Refreshing instance network info cache due to event network-changed-c6c9c6b8-8279-46c9-839c-b2f5011e57b6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:12:57 compute-0 nova_compute[186981]: 2025-11-22 10:12:57.883 186985 DEBUG oslo_concurrency.lockutils [req-a2bfd5f6-b9a1-4838-9288-fa1edce14101 req-e674303f-a9f2-4b74-a60f-74cdd9d45935 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:12:58 compute-0 nova_compute[186981]: 2025-11-22 10:12:58.126 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:58 compute-0 nova_compute[186981]: 2025-11-22 10:12:58.175 186985 DEBUG nova.network.neutron [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.225 186985 DEBUG nova.network.neutron [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Updating instance_info_cache with network_info: [{"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.257 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Releasing lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.257 186985 DEBUG nova.compute.manager [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Instance network_info: |[{"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.258 186985 DEBUG oslo_concurrency.lockutils [req-a2bfd5f6-b9a1-4838-9288-fa1edce14101 req-e674303f-a9f2-4b74-a60f-74cdd9d45935 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.259 186985 DEBUG nova.network.neutron [req-a2bfd5f6-b9a1-4838-9288-fa1edce14101 req-e674303f-a9f2-4b74-a60f-74cdd9d45935 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Refreshing network info cache for port c6c9c6b8-8279-46c9-839c-b2f5011e57b6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.263 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Start _get_guest_xml network_info=[{"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '7f933537-dfd2-407d-a523-ec45187c75fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.269 186985 WARNING nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.281 186985 DEBUG nova.virt.libvirt.host [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.282 186985 DEBUG nova.virt.libvirt.host [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.287 186985 DEBUG nova.virt.libvirt.host [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.287 186985 DEBUG nova.virt.libvirt.host [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.288 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.288 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T10:01:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1ae632e-4cf1-4552-835d-a183c94ebdfc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T10:01:51Z,direct_url=<?>,disk_format='qcow2',id=7f933537-dfd2-407d-a523-ec45187c75fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b797995ce7e2414bb591227b83fccf41',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T10:01:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.289 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.290 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.290 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.291 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.291 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.291 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.292 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.292 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.293 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.293 186985 DEBUG nova.virt.hardware [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.299 186985 DEBUG nova.virt.libvirt.vif [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1539712246',display_name='tempest-TestNetworkBasicOps-server-1539712246',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1539712246',id=13,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKHtIvCFeVeOxcibzAEUh72Jq9e/4HpHFuC/6IQ8LbkPTGw+dLbWIIBBlwRl5UfcMu3FHMK5uAhqGuizqm/iGctBci2blu6IIlI53gtX+cn4RvAYWnDlzL3TyCMpHp42lw==',key_name='tempest-TestNetworkBasicOps-399070402',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-e0et608g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:12:55Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=ea3ef3d3-b413-4626-b3a0-e09ab809e661,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.299 186985 DEBUG nova.network.os_vif_util [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.301 186985 DEBUG nova.network.os_vif_util [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:0b:cf,bridge_name='br-int',has_traffic_filtering=True,id=c6c9c6b8-8279-46c9-839c-b2f5011e57b6,network=Network(c93da617-8dc2-443c-a250-32e835de0d91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c9c6b8-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.302 186985 DEBUG nova.objects.instance [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid ea3ef3d3-b413-4626-b3a0-e09ab809e661 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.320 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] End _get_guest_xml xml=<domain type="kvm">
Nov 22 10:12:59 compute-0 nova_compute[186981]:   <uuid>ea3ef3d3-b413-4626-b3a0-e09ab809e661</uuid>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   <name>instance-0000000d</name>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   <memory>131072</memory>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   <vcpu>1</vcpu>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   <metadata>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <nova:name>tempest-TestNetworkBasicOps-server-1539712246</nova:name>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <nova:creationTime>2025-11-22 10:12:59</nova:creationTime>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <nova:flavor name="m1.nano">
Nov 22 10:12:59 compute-0 nova_compute[186981]:         <nova:memory>128</nova:memory>
Nov 22 10:12:59 compute-0 nova_compute[186981]:         <nova:disk>1</nova:disk>
Nov 22 10:12:59 compute-0 nova_compute[186981]:         <nova:swap>0</nova:swap>
Nov 22 10:12:59 compute-0 nova_compute[186981]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 10:12:59 compute-0 nova_compute[186981]:         <nova:vcpus>1</nova:vcpus>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       </nova:flavor>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <nova:owner>
Nov 22 10:12:59 compute-0 nova_compute[186981]:         <nova:user uuid="fd88a700663e44618f0a22f234573806">tempest-TestNetworkBasicOps-171376730-project-member</nova:user>
Nov 22 10:12:59 compute-0 nova_compute[186981]:         <nova:project uuid="b60c6181ec1c449ab3dd7a45969909f7">tempest-TestNetworkBasicOps-171376730</nova:project>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       </nova:owner>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <nova:root type="image" uuid="7f933537-dfd2-407d-a523-ec45187c75fc"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <nova:ports>
Nov 22 10:12:59 compute-0 nova_compute[186981]:         <nova:port uuid="c6c9c6b8-8279-46c9-839c-b2f5011e57b6">
Nov 22 10:12:59 compute-0 nova_compute[186981]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:         </nova:port>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       </nova:ports>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     </nova:instance>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   </metadata>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   <sysinfo type="smbios">
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <system>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <entry name="manufacturer">RDO</entry>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <entry name="product">OpenStack Compute</entry>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <entry name="serial">ea3ef3d3-b413-4626-b3a0-e09ab809e661</entry>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <entry name="uuid">ea3ef3d3-b413-4626-b3a0-e09ab809e661</entry>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <entry name="family">Virtual Machine</entry>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     </system>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   </sysinfo>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   <os>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <boot dev="hd"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <smbios mode="sysinfo"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   </os>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   <features>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <acpi/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <apic/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <vmcoreinfo/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   </features>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   <clock offset="utc">
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <timer name="hpet" present="no"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   </clock>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   <cpu mode="host-model" match="exact">
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   </cpu>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   <devices>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <disk type="file" device="disk">
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <target dev="vda" bus="virtio"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <disk type="file" device="cdrom">
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <source file="/var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk.config"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <target dev="sda" bus="sata"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     </disk>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <interface type="ethernet">
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <mac address="fa:16:3e:dc:0b:cf"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <mtu size="1442"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <target dev="tapc6c9c6b8-82"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     </interface>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <serial type="pty">
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <log file="/var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/console.log" append="off"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     </serial>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <video>
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <model type="virtio"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     </video>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <input type="tablet" bus="usb"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <rng model="virtio">
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <backend model="random">/dev/urandom</backend>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     </rng>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <controller type="usb" index="0"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     <memballoon model="virtio">
Nov 22 10:12:59 compute-0 nova_compute[186981]:       <stats period="10"/>
Nov 22 10:12:59 compute-0 nova_compute[186981]:     </memballoon>
Nov 22 10:12:59 compute-0 nova_compute[186981]:   </devices>
Nov 22 10:12:59 compute-0 nova_compute[186981]: </domain>
Nov 22 10:12:59 compute-0 nova_compute[186981]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.324 186985 DEBUG nova.compute.manager [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Preparing to wait for external event network-vif-plugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.324 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.325 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.326 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.327 186985 DEBUG nova.virt.libvirt.vif [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T10:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1539712246',display_name='tempest-TestNetworkBasicOps-server-1539712246',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1539712246',id=13,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKHtIvCFeVeOxcibzAEUh72Jq9e/4HpHFuC/6IQ8LbkPTGw+dLbWIIBBlwRl5UfcMu3FHMK5uAhqGuizqm/iGctBci2blu6IIlI53gtX+cn4RvAYWnDlzL3TyCMpHp42lw==',key_name='tempest-TestNetworkBasicOps-399070402',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-e0et608g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T10:12:55Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=ea3ef3d3-b413-4626-b3a0-e09ab809e661,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.328 186985 DEBUG nova.network.os_vif_util [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.329 186985 DEBUG nova.network.os_vif_util [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:0b:cf,bridge_name='br-int',has_traffic_filtering=True,id=c6c9c6b8-8279-46c9-839c-b2f5011e57b6,network=Network(c93da617-8dc2-443c-a250-32e835de0d91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c9c6b8-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.330 186985 DEBUG os_vif [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:0b:cf,bridge_name='br-int',has_traffic_filtering=True,id=c6c9c6b8-8279-46c9-839c-b2f5011e57b6,network=Network(c93da617-8dc2-443c-a250-32e835de0d91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c9c6b8-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.331 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.332 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.333 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.337 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.337 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6c9c6b8-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.338 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc6c9c6b8-82, col_values=(('external_ids', {'iface-id': 'c6c9c6b8-8279-46c9-839c-b2f5011e57b6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dc:0b:cf', 'vm-uuid': 'ea3ef3d3-b413-4626-b3a0-e09ab809e661'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.340 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:59 compute-0 NetworkManager[55425]: <info>  [1763806379.3421] manager: (tapc6c9c6b8-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.343 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.348 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.350 186985 INFO os_vif [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:0b:cf,bridge_name='br-int',has_traffic_filtering=True,id=c6c9c6b8-8279-46c9-839c-b2f5011e57b6,network=Network(c93da617-8dc2-443c-a250-32e835de0d91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c9c6b8-82')
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.417 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.417 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.418 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] No VIF found with MAC fa:16:3e:dc:0b:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.419 186985 INFO nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Using config drive
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.763 186985 INFO nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Creating config drive at /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk.config
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.772 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp88g2_2jy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:12:59 compute-0 nova_compute[186981]: 2025-11-22 10:12:59.912 186985 DEBUG oslo_concurrency.processutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp88g2_2jy" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:12:59 compute-0 kernel: tapc6c9c6b8-82: entered promiscuous mode
Nov 22 10:12:59 compute-0 NetworkManager[55425]: <info>  [1763806379.9813] manager: (tapc6c9c6b8-82): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Nov 22 10:13:00 compute-0 ovn_controller[95329]: 2025-11-22T10:13:00Z|00164|binding|INFO|Claiming lport c6c9c6b8-8279-46c9-839c-b2f5011e57b6 for this chassis.
Nov 22 10:13:00 compute-0 ovn_controller[95329]: 2025-11-22T10:13:00Z|00165|binding|INFO|c6c9c6b8-8279-46c9-839c-b2f5011e57b6: Claiming fa:16:3e:dc:0b:cf 10.100.0.13
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.032 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.039 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.047 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:0b:cf 10.100.0.13'], port_security=['fa:16:3e:dc:0b:cf 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ea3ef3d3-b413-4626-b3a0-e09ab809e661', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c93da617-8dc2-443c-a250-32e835de0d91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '823337f9-8a53-498d-a34e-9c6a37348cb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdd87b40-dcfd-41dd-a2f3-26f18606e8b9, chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=c6c9c6b8-8279-46c9-839c-b2f5011e57b6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.049 104216 INFO neutron.agent.ovn.metadata.agent [-] Port c6c9c6b8-8279-46c9-839c-b2f5011e57b6 in datapath c93da617-8dc2-443c-a250-32e835de0d91 bound to our chassis
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.051 104216 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c93da617-8dc2-443c-a250-32e835de0d91
Nov 22 10:13:00 compute-0 systemd-udevd[219291]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.061 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[69eb2966-5730-41f9-bd18-930b33513d3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.062 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc93da617-81 in ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.064 213484 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc93da617-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.064 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[44eb5206-b351-4b49-ae2c-bf2947ae28eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.065 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[73806820-ed8a-4d3a-9cf5-2cabf3ec6edd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 systemd-machined[153303]: New machine qemu-13-instance-0000000d.
Nov 22 10:13:00 compute-0 NetworkManager[55425]: <info>  [1763806380.0699] device (tapc6c9c6b8-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 10:13:00 compute-0 NetworkManager[55425]: <info>  [1763806380.0709] device (tapc6c9c6b8-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.075 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb00ea7-38df-4e04-9cef-899e529a41ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_controller[95329]: 2025-11-22T10:13:00Z|00166|binding|INFO|Setting lport c6c9c6b8-8279-46c9-839c-b2f5011e57b6 ovn-installed in OVS
Nov 22 10:13:00 compute-0 ovn_controller[95329]: 2025-11-22T10:13:00Z|00167|binding|INFO|Setting lport c6c9c6b8-8279-46c9-839c-b2f5011e57b6 up in Southbound
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.098 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:00 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.106 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[86df9c0d-bd2b-42b8-a0f3-918eb3b20bf9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.141 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[3a9fdef9-8be2-4180-bec2-255557423204]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.147 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[83ee1f1e-9a89-484e-879b-8c8a83301b4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 NetworkManager[55425]: <info>  [1763806380.1481] manager: (tapc93da617-80): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.180 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[a625c6d0-6097-4582-ae05-bb318fb66fce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.183 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe95173-da4e-4405-b20e-8a239beae984]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 podman[219299]: 2025-11-22 10:13:00.20998821 +0000 UTC m=+0.060598928 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:13:00 compute-0 NetworkManager[55425]: <info>  [1763806380.2106] device (tapc93da617-80): carrier: link connected
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.215 213545 DEBUG oslo.privsep.daemon [-] privsep: reply[d518aa8d-ae63-4fa5-9f78-2d6ef3fe1e54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.230 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[c0c07d56-8c6b-407e-90ca-6cc379da5b1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc93da617-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:7a:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383791, 'reachable_time': 30903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219346, 'error': None, 'target': 'ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.246 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[631e37bc-8a7b-49e6-b85a-360d8835a4ed]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feab:7a1b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383791, 'tstamp': 383791}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219347, 'error': None, 'target': 'ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.261 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[9cf19386-3850-4718-bec7-9437567a2906]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc93da617-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:7a:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383791, 'reachable_time': 30903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219348, 'error': None, 'target': 'ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.294 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[d26d5d99-ad10-4914-ab97-6af28f08905a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.339 186985 DEBUG nova.compute.manager [req-e86b8a5e-6919-420c-bcf8-1eaafb5d4192 req-b069ab92-17cb-4262-8274-5fb913b3daea 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Received event network-vif-plugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.340 186985 DEBUG oslo_concurrency.lockutils [req-e86b8a5e-6919-420c-bcf8-1eaafb5d4192 req-b069ab92-17cb-4262-8274-5fb913b3daea 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.340 186985 DEBUG oslo_concurrency.lockutils [req-e86b8a5e-6919-420c-bcf8-1eaafb5d4192 req-b069ab92-17cb-4262-8274-5fb913b3daea 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.341 186985 DEBUG oslo_concurrency.lockutils [req-e86b8a5e-6919-420c-bcf8-1eaafb5d4192 req-b069ab92-17cb-4262-8274-5fb913b3daea 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.341 186985 DEBUG nova.compute.manager [req-e86b8a5e-6919-420c-bcf8-1eaafb5d4192 req-b069ab92-17cb-4262-8274-5fb913b3daea 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Processing event network-vif-plugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.347 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[d0b54a16-d01b-4a88-bb78-ea131d9b6fed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.349 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc93da617-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.350 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.351 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc93da617-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.352 186985 DEBUG nova.network.neutron [req-a2bfd5f6-b9a1-4838-9288-fa1edce14101 req-e674303f-a9f2-4b74-a60f-74cdd9d45935 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Updated VIF entry in instance network info cache for port c6c9c6b8-8279-46c9-839c-b2f5011e57b6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.352 186985 DEBUG nova.network.neutron [req-a2bfd5f6-b9a1-4838-9288-fa1edce14101 req-e674303f-a9f2-4b74-a60f-74cdd9d45935 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Updating instance_info_cache with network_info: [{"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:13:00 compute-0 NetworkManager[55425]: <info>  [1763806380.3541] manager: (tapc93da617-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Nov 22 10:13:00 compute-0 kernel: tapc93da617-80: entered promiscuous mode
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.353 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.356 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc93da617-80, col_values=(('external_ids', {'iface-id': '69c5b53f-1522-47b6-9401-9d1e1825a5fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.357 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:00 compute-0 ovn_controller[95329]: 2025-11-22T10:13:00Z|00168|binding|INFO|Releasing lport 69c5b53f-1522-47b6-9401-9d1e1825a5fc from this chassis (sb_readonly=0)
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.358 104216 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c93da617-8dc2-443c-a250-32e835de0d91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c93da617-8dc2-443c-a250-32e835de0d91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.360 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[8969c28d-7f16-47c3-92af-5f2de8140f17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.361 104216 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: global
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     log         /dev/log local0 debug
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     log-tag     haproxy-metadata-proxy-c93da617-8dc2-443c-a250-32e835de0d91
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     user        root
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     group       root
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     maxconn     1024
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     pidfile     /var/lib/neutron/external/pids/c93da617-8dc2-443c-a250-32e835de0d91.pid.haproxy
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     daemon
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: defaults
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     log global
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     mode http
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     option httplog
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     option dontlognull
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     option http-server-close
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     option forwardfor
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     retries                 3
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     timeout http-request    30s
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     timeout connect         30s
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     timeout client          32s
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     timeout server          32s
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     timeout http-keep-alive 30s
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: listen listener
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     bind 169.254.169.254:80
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:     http-request add-header X-OVN-Network-ID c93da617-8dc2-443c-a250-32e835de0d91
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 10:13:00 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:00.362 104216 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91', 'env', 'PROCESS_TAG=haproxy-c93da617-8dc2-443c-a250-32e835de0d91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c93da617-8dc2-443c-a250-32e835de0d91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.368 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.371 186985 DEBUG oslo_concurrency.lockutils [req-a2bfd5f6-b9a1-4838-9288-fa1edce14101 req-e674303f-a9f2-4b74-a60f-74cdd9d45935 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.573 186985 DEBUG nova.compute.manager [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.574 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806380.5728498, ea3ef3d3-b413-4626-b3a0-e09ab809e661 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.575 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] VM Started (Lifecycle Event)
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.581 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.584 186985 INFO nova.virt.libvirt.driver [-] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Instance spawned successfully.
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.584 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.600 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.606 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.609 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.609 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.610 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.610 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.611 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.611 186985 DEBUG nova.virt.libvirt.driver [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.634 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.635 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806380.573059, ea3ef3d3-b413-4626-b3a0-e09ab809e661 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.635 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] VM Paused (Lifecycle Event)
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.661 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.665 186985 DEBUG nova.virt.driver [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] Emitting event <LifecycleEvent: 1763806380.5797293, ea3ef3d3-b413-4626-b3a0-e09ab809e661 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.665 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] VM Resumed (Lifecycle Event)
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.673 186985 INFO nova.compute.manager [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Took 4.74 seconds to spawn the instance on the hypervisor.
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.673 186985 DEBUG nova.compute.manager [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.684 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.688 186985 DEBUG nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.713 186985 INFO nova.compute.manager [None req-e9667edb-e53e-4a3c-9144-295b13fee5f9 - - - - - -] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.736 186985 INFO nova.compute.manager [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Took 5.16 seconds to build instance.
Nov 22 10:13:00 compute-0 nova_compute[186981]: 2025-11-22 10:13:00.751 186985 DEBUG oslo_concurrency.lockutils [None req-8f2bc9a4-bb86-4a2e-88bb-dec0437979a6 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:13:00 compute-0 podman[219387]: 2025-11-22 10:13:00.816989206 +0000 UTC m=+0.069253312 container create 1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 10:13:00 compute-0 systemd[1]: Started libpod-conmon-1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5.scope.
Nov 22 10:13:00 compute-0 podman[219387]: 2025-11-22 10:13:00.774980795 +0000 UTC m=+0.027244951 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 10:13:00 compute-0 systemd[1]: Started libcrun container.
Nov 22 10:13:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31492ba71860e2d2647a37e70fea088bc935f78fafc1816ad99a8033cdad9722/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 10:13:00 compute-0 podman[219387]: 2025-11-22 10:13:00.914487213 +0000 UTC m=+0.166751369 container init 1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 10:13:00 compute-0 podman[219387]: 2025-11-22 10:13:00.92760174 +0000 UTC m=+0.179865846 container start 1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 10:13:00 compute-0 neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91[219403]: [NOTICE]   (219407) : New worker (219409) forked
Nov 22 10:13:00 compute-0 neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91[219403]: [NOTICE]   (219407) : Loading success.
Nov 22 10:13:01 compute-0 nova_compute[186981]: 2025-11-22 10:13:01.212 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:02 compute-0 nova_compute[186981]: 2025-11-22 10:13:02.412 186985 DEBUG nova.compute.manager [req-d699934f-74fb-48ab-9fb5-84cdc44d86db req-8be3154e-5cbf-41d4-991f-b4c42845e833 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Received event network-vif-plugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:13:02 compute-0 nova_compute[186981]: 2025-11-22 10:13:02.413 186985 DEBUG oslo_concurrency.lockutils [req-d699934f-74fb-48ab-9fb5-84cdc44d86db req-8be3154e-5cbf-41d4-991f-b4c42845e833 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:13:02 compute-0 nova_compute[186981]: 2025-11-22 10:13:02.413 186985 DEBUG oslo_concurrency.lockutils [req-d699934f-74fb-48ab-9fb5-84cdc44d86db req-8be3154e-5cbf-41d4-991f-b4c42845e833 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:13:02 compute-0 nova_compute[186981]: 2025-11-22 10:13:02.413 186985 DEBUG oslo_concurrency.lockutils [req-d699934f-74fb-48ab-9fb5-84cdc44d86db req-8be3154e-5cbf-41d4-991f-b4c42845e833 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:13:02 compute-0 nova_compute[186981]: 2025-11-22 10:13:02.413 186985 DEBUG nova.compute.manager [req-d699934f-74fb-48ab-9fb5-84cdc44d86db req-8be3154e-5cbf-41d4-991f-b4c42845e833 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] No waiting events found dispatching network-vif-plugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:13:02 compute-0 nova_compute[186981]: 2025-11-22 10:13:02.414 186985 WARNING nova.compute.manager [req-d699934f-74fb-48ab-9fb5-84cdc44d86db req-8be3154e-5cbf-41d4-991f-b4c42845e833 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Received unexpected event network-vif-plugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 for instance with vm_state active and task_state None.
Nov 22 10:13:03 compute-0 ovn_controller[95329]: 2025-11-22T10:13:03Z|00169|binding|INFO|Releasing lport 69c5b53f-1522-47b6-9401-9d1e1825a5fc from this chassis (sb_readonly=0)
Nov 22 10:13:03 compute-0 nova_compute[186981]: 2025-11-22 10:13:03.773 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:03 compute-0 NetworkManager[55425]: <info>  [1763806383.7746] manager: (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Nov 22 10:13:03 compute-0 NetworkManager[55425]: <info>  [1763806383.7764] manager: (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Nov 22 10:13:03 compute-0 ovn_controller[95329]: 2025-11-22T10:13:03Z|00170|binding|INFO|Releasing lport 69c5b53f-1522-47b6-9401-9d1e1825a5fc from this chassis (sb_readonly=0)
Nov 22 10:13:03 compute-0 nova_compute[186981]: 2025-11-22 10:13:03.827 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:03 compute-0 nova_compute[186981]: 2025-11-22 10:13:03.838 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:04 compute-0 nova_compute[186981]: 2025-11-22 10:13:04.062 186985 DEBUG nova.compute.manager [req-68456af2-6b60-4e06-b9df-ec841eccd47e req-f76c0b0b-2946-4953-9ba1-48159ade5ea6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Received event network-changed-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:13:04 compute-0 nova_compute[186981]: 2025-11-22 10:13:04.063 186985 DEBUG nova.compute.manager [req-68456af2-6b60-4e06-b9df-ec841eccd47e req-f76c0b0b-2946-4953-9ba1-48159ade5ea6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Refreshing instance network info cache due to event network-changed-c6c9c6b8-8279-46c9-839c-b2f5011e57b6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:13:04 compute-0 nova_compute[186981]: 2025-11-22 10:13:04.063 186985 DEBUG oslo_concurrency.lockutils [req-68456af2-6b60-4e06-b9df-ec841eccd47e req-f76c0b0b-2946-4953-9ba1-48159ade5ea6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:13:04 compute-0 nova_compute[186981]: 2025-11-22 10:13:04.064 186985 DEBUG oslo_concurrency.lockutils [req-68456af2-6b60-4e06-b9df-ec841eccd47e req-f76c0b0b-2946-4953-9ba1-48159ade5ea6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:13:04 compute-0 nova_compute[186981]: 2025-11-22 10:13:04.065 186985 DEBUG nova.network.neutron [req-68456af2-6b60-4e06-b9df-ec841eccd47e req-f76c0b0b-2946-4953-9ba1-48159ade5ea6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Refreshing network info cache for port c6c9c6b8-8279-46c9-839c-b2f5011e57b6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:13:04 compute-0 nova_compute[186981]: 2025-11-22 10:13:04.341 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:05 compute-0 nova_compute[186981]: 2025-11-22 10:13:05.336 186985 DEBUG nova.network.neutron [req-68456af2-6b60-4e06-b9df-ec841eccd47e req-f76c0b0b-2946-4953-9ba1-48159ade5ea6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Updated VIF entry in instance network info cache for port c6c9c6b8-8279-46c9-839c-b2f5011e57b6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:13:05 compute-0 nova_compute[186981]: 2025-11-22 10:13:05.337 186985 DEBUG nova.network.neutron [req-68456af2-6b60-4e06-b9df-ec841eccd47e req-f76c0b0b-2946-4953-9ba1-48159ade5ea6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Updating instance_info_cache with network_info: [{"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:13:05 compute-0 nova_compute[186981]: 2025-11-22 10:13:05.356 186985 DEBUG oslo_concurrency.lockutils [req-68456af2-6b60-4e06-b9df-ec841eccd47e req-f76c0b0b-2946-4953-9ba1-48159ade5ea6 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:13:06 compute-0 nova_compute[186981]: 2025-11-22 10:13:06.263 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:09 compute-0 nova_compute[186981]: 2025-11-22 10:13:09.345 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:11 compute-0 nova_compute[186981]: 2025-11-22 10:13:11.264 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:11 compute-0 podman[219423]: 2025-11-22 10:13:11.621042115 +0000 UTC m=+0.067010691 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 10:13:11 compute-0 podman[219424]: 2025-11-22 10:13:11.683333297 +0000 UTC m=+0.117314517 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:13:13 compute-0 ovn_controller[95329]: 2025-11-22T10:13:13Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dc:0b:cf 10.100.0.13
Nov 22 10:13:13 compute-0 ovn_controller[95329]: 2025-11-22T10:13:13Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dc:0b:cf 10.100.0.13
Nov 22 10:13:14 compute-0 nova_compute[186981]: 2025-11-22 10:13:14.347 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:16 compute-0 nova_compute[186981]: 2025-11-22 10:13:16.305 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:17.943 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:13:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:17.944 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:13:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:17.945 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.615 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.615 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:13:18 compute-0 podman[219476]: 2025-11-22 10:13:18.628594548 +0000 UTC m=+0.071905064 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.640 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.641 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.641 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.641 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:13:18 compute-0 podman[219475]: 2025-11-22 10:13:18.643603256 +0000 UTC m=+0.087820406 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.673 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.673 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.674 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.674 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.749 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.803 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.804 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:13:18 compute-0 nova_compute[186981]: 2025-11-22 10:13:18.858 186985 DEBUG oslo_concurrency.processutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.027 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.029 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5578MB free_disk=73.43006896972656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.029 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.030 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.133 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Instance ea3ef3d3-b413-4626-b3a0-e09ab809e661 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.134 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.134 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.171 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing inventories for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.204 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating ProviderTree inventory for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.205 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating inventory in ProviderTree for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.221 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing aggregate associations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.244 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing trait associations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703, traits: COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE2,HW_CPU_X86_F16C,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE4A,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.288 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.317 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.344 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.344 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:13:19 compute-0 nova_compute[186981]: 2025-11-22 10:13:19.351 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:20 compute-0 nova_compute[186981]: 2025-11-22 10:13:20.298 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:13:20 compute-0 nova_compute[186981]: 2025-11-22 10:13:20.581 186985 INFO nova.compute.manager [None req-d453aab3-1972-46fb-a458-721649a045c3 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Get console output
Nov 22 10:13:20 compute-0 nova_compute[186981]: 2025-11-22 10:13:20.587 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:13:20 compute-0 podman[219522]: 2025-11-22 10:13:20.608378655 +0000 UTC m=+0.065947821 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 10:13:20 compute-0 podman[219523]: 2025-11-22 10:13:20.617136844 +0000 UTC m=+0.071386000 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 10:13:21 compute-0 nova_compute[186981]: 2025-11-22 10:13:21.306 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:21 compute-0 nova_compute[186981]: 2025-11-22 10:13:21.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:13:21 compute-0 nova_compute[186981]: 2025-11-22 10:13:21.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:13:21 compute-0 nova_compute[186981]: 2025-11-22 10:13:21.593 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:13:21 compute-0 ovn_controller[95329]: 2025-11-22T10:13:21Z|00171|binding|INFO|Releasing lport 69c5b53f-1522-47b6-9401-9d1e1825a5fc from this chassis (sb_readonly=0)
Nov 22 10:13:21 compute-0 nova_compute[186981]: 2025-11-22 10:13:21.775 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:21 compute-0 ovn_controller[95329]: 2025-11-22T10:13:21Z|00172|binding|INFO|Releasing lport 69c5b53f-1522-47b6-9401-9d1e1825a5fc from this chassis (sb_readonly=0)
Nov 22 10:13:21 compute-0 nova_compute[186981]: 2025-11-22 10:13:21.862 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:22 compute-0 nova_compute[186981]: 2025-11-22 10:13:22.589 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:13:23 compute-0 nova_compute[186981]: 2025-11-22 10:13:23.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:13:24 compute-0 nova_compute[186981]: 2025-11-22 10:13:24.048 186985 INFO nova.compute.manager [None req-71fe6761-3954-4b0c-8072-12a5dcaa97a1 fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Get console output
Nov 22 10:13:24 compute-0 nova_compute[186981]: 2025-11-22 10:13:24.054 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:13:24 compute-0 nova_compute[186981]: 2025-11-22 10:13:24.354 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:24 compute-0 nova_compute[186981]: 2025-11-22 10:13:24.588 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:13:26 compute-0 nova_compute[186981]: 2025-11-22 10:13:26.352 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:28 compute-0 nova_compute[186981]: 2025-11-22 10:13:28.280 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:28 compute-0 NetworkManager[55425]: <info>  [1763806408.2818] manager: (patch-br-int-to-provnet-4019b385-7026-46d5-9fc6-69b4037cce96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Nov 22 10:13:28 compute-0 NetworkManager[55425]: <info>  [1763806408.2836] manager: (patch-provnet-4019b385-7026-46d5-9fc6-69b4037cce96-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Nov 22 10:13:28 compute-0 nova_compute[186981]: 2025-11-22 10:13:28.323 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:28 compute-0 ovn_controller[95329]: 2025-11-22T10:13:28Z|00173|binding|INFO|Releasing lport 69c5b53f-1522-47b6-9401-9d1e1825a5fc from this chassis (sb_readonly=0)
Nov 22 10:13:28 compute-0 nova_compute[186981]: 2025-11-22 10:13:28.332 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:29 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:29.355 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:13:29 compute-0 nova_compute[186981]: 2025-11-22 10:13:29.355 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:29 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:29.356 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:13:29 compute-0 nova_compute[186981]: 2025-11-22 10:13:29.406 186985 INFO nova.compute.manager [None req-2f501375-6ace-4fbb-be2f-79d0099750ea fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Get console output
Nov 22 10:13:29 compute-0 nova_compute[186981]: 2025-11-22 10:13:29.414 213374 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 10:13:30 compute-0 podman[219567]: 2025-11-22 10:13:30.608518332 +0000 UTC m=+0.061296046 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.354 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.775 186985 DEBUG nova.compute.manager [req-7c25c956-8899-4f64-a26d-7f7d80374f71 req-e36ce158-b92c-4201-a64d-947daba839ad 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Received event network-changed-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.775 186985 DEBUG nova.compute.manager [req-7c25c956-8899-4f64-a26d-7f7d80374f71 req-e36ce158-b92c-4201-a64d-947daba839ad 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Refreshing instance network info cache due to event network-changed-c6c9c6b8-8279-46c9-839c-b2f5011e57b6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.775 186985 DEBUG oslo_concurrency.lockutils [req-7c25c956-8899-4f64-a26d-7f7d80374f71 req-e36ce158-b92c-4201-a64d-947daba839ad 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.775 186985 DEBUG oslo_concurrency.lockutils [req-7c25c956-8899-4f64-a26d-7f7d80374f71 req-e36ce158-b92c-4201-a64d-947daba839ad 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquired lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.775 186985 DEBUG nova.network.neutron [req-7c25c956-8899-4f64-a26d-7f7d80374f71 req-e36ce158-b92c-4201-a64d-947daba839ad 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Refreshing network info cache for port c6c9c6b8-8279-46c9-839c-b2f5011e57b6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.866 186985 DEBUG oslo_concurrency.lockutils [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.867 186985 DEBUG oslo_concurrency.lockutils [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.867 186985 DEBUG oslo_concurrency.lockutils [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.868 186985 DEBUG oslo_concurrency.lockutils [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.868 186985 DEBUG oslo_concurrency.lockutils [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.869 186985 INFO nova.compute.manager [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Terminating instance
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.870 186985 DEBUG nova.compute.manager [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 10:13:31 compute-0 kernel: tapc6c9c6b8-82 (unregistering): left promiscuous mode
Nov 22 10:13:31 compute-0 NetworkManager[55425]: <info>  [1763806411.8916] device (tapc6c9c6b8-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 10:13:31 compute-0 ovn_controller[95329]: 2025-11-22T10:13:31Z|00174|binding|INFO|Releasing lport c6c9c6b8-8279-46c9-839c-b2f5011e57b6 from this chassis (sb_readonly=0)
Nov 22 10:13:31 compute-0 ovn_controller[95329]: 2025-11-22T10:13:31Z|00175|binding|INFO|Setting lport c6c9c6b8-8279-46c9-839c-b2f5011e57b6 down in Southbound
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.901 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:31 compute-0 ovn_controller[95329]: 2025-11-22T10:13:31Z|00176|binding|INFO|Removing iface tapc6c9c6b8-82 ovn-installed in OVS
Nov 22 10:13:31 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:31.911 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:0b:cf 10.100.0.13'], port_security=['fa:16:3e:dc:0b:cf 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ea3ef3d3-b413-4626-b3a0-e09ab809e661', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c93da617-8dc2-443c-a250-32e835de0d91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b60c6181ec1c449ab3dd7a45969909f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '823337f9-8a53-498d-a34e-9c6a37348cb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdd87b40-dcfd-41dd-a2f3-26f18606e8b9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>], logical_port=c6c9c6b8-8279-46c9-839c-b2f5011e57b6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f66492176a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:13:31 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:31.913 104216 INFO neutron.agent.ovn.metadata.agent [-] Port c6c9c6b8-8279-46c9-839c-b2f5011e57b6 in datapath c93da617-8dc2-443c-a250-32e835de0d91 unbound from our chassis
Nov 22 10:13:31 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:31.915 104216 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c93da617-8dc2-443c-a250-32e835de0d91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 10:13:31 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:31.917 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[2b3d0975-06ce-4e9e-b64c-d87293640815]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:31 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:31.918 104216 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91 namespace which is not needed anymore
Nov 22 10:13:31 compute-0 nova_compute[186981]: 2025-11-22 10:13:31.933 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:31 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 22 10:13:31 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 13.374s CPU time.
Nov 22 10:13:31 compute-0 systemd-machined[153303]: Machine qemu-13-instance-0000000d terminated.
Nov 22 10:13:32 compute-0 neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91[219403]: [NOTICE]   (219407) : haproxy version is 2.8.14-c23fe91
Nov 22 10:13:32 compute-0 neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91[219403]: [NOTICE]   (219407) : path to executable is /usr/sbin/haproxy
Nov 22 10:13:32 compute-0 neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91[219403]: [WARNING]  (219407) : Exiting Master process...
Nov 22 10:13:32 compute-0 neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91[219403]: [WARNING]  (219407) : Exiting Master process...
Nov 22 10:13:32 compute-0 neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91[219403]: [ALERT]    (219407) : Current worker (219409) exited with code 143 (Terminated)
Nov 22 10:13:32 compute-0 neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91[219403]: [WARNING]  (219407) : All workers exited. Exiting... (0)
Nov 22 10:13:32 compute-0 systemd[1]: libpod-1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5.scope: Deactivated successfully.
Nov 22 10:13:32 compute-0 podman[219615]: 2025-11-22 10:13:32.087944831 +0000 UTC m=+0.050504003 container died 1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:13:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5-userdata-shm.mount: Deactivated successfully.
Nov 22 10:13:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-31492ba71860e2d2647a37e70fea088bc935f78fafc1816ad99a8033cdad9722-merged.mount: Deactivated successfully.
Nov 22 10:13:32 compute-0 podman[219615]: 2025-11-22 10:13:32.12730546 +0000 UTC m=+0.089864632 container cleanup 1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 10:13:32 compute-0 systemd[1]: libpod-conmon-1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5.scope: Deactivated successfully.
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.145 186985 DEBUG nova.compute.manager [req-408ee186-5c84-4b84-8bb3-c20cd98eccdc req-f6a10195-0d8d-4c5c-86a0-0627f90c2b4f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Received event network-vif-unplugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.145 186985 DEBUG oslo_concurrency.lockutils [req-408ee186-5c84-4b84-8bb3-c20cd98eccdc req-f6a10195-0d8d-4c5c-86a0-0627f90c2b4f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.146 186985 DEBUG oslo_concurrency.lockutils [req-408ee186-5c84-4b84-8bb3-c20cd98eccdc req-f6a10195-0d8d-4c5c-86a0-0627f90c2b4f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.146 186985 DEBUG oslo_concurrency.lockutils [req-408ee186-5c84-4b84-8bb3-c20cd98eccdc req-f6a10195-0d8d-4c5c-86a0-0627f90c2b4f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.146 186985 DEBUG nova.compute.manager [req-408ee186-5c84-4b84-8bb3-c20cd98eccdc req-f6a10195-0d8d-4c5c-86a0-0627f90c2b4f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] No waiting events found dispatching network-vif-unplugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.146 186985 DEBUG nova.compute.manager [req-408ee186-5c84-4b84-8bb3-c20cd98eccdc req-f6a10195-0d8d-4c5c-86a0-0627f90c2b4f 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Received event network-vif-unplugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.148 186985 INFO nova.virt.libvirt.driver [-] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Instance destroyed successfully.
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.148 186985 DEBUG nova.objects.instance [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lazy-loading 'resources' on Instance uuid ea3ef3d3-b413-4626-b3a0-e09ab809e661 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.170 186985 DEBUG nova.virt.libvirt.vif [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T10:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1539712246',display_name='tempest-TestNetworkBasicOps-server-1539712246',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1539712246',id=13,image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKHtIvCFeVeOxcibzAEUh72Jq9e/4HpHFuC/6IQ8LbkPTGw+dLbWIIBBlwRl5UfcMu3FHMK5uAhqGuizqm/iGctBci2blu6IIlI53gtX+cn4RvAYWnDlzL3TyCMpHp42lw==',key_name='tempest-TestNetworkBasicOps-399070402',keypairs=<?>,launch_index=0,launched_at=2025-11-22T10:13:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b60c6181ec1c449ab3dd7a45969909f7',ramdisk_id='',reservation_id='r-e0et608g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7f933537-dfd2-407d-a523-ec45187c75fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-171376730',owner_user_name='tempest-TestNetworkBasicOps-171376730-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T10:13:00Z,user_data=None,user_id='fd88a700663e44618f0a22f234573806',uuid=ea3ef3d3-b413-4626-b3a0-e09ab809e661,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.170 186985 DEBUG nova.network.os_vif_util [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converting VIF {"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.171 186985 DEBUG nova.network.os_vif_util [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dc:0b:cf,bridge_name='br-int',has_traffic_filtering=True,id=c6c9c6b8-8279-46c9-839c-b2f5011e57b6,network=Network(c93da617-8dc2-443c-a250-32e835de0d91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c9c6b8-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.171 186985 DEBUG os_vif [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:0b:cf,bridge_name='br-int',has_traffic_filtering=True,id=c6c9c6b8-8279-46c9-839c-b2f5011e57b6,network=Network(c93da617-8dc2-443c-a250-32e835de0d91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c9c6b8-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.175 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.176 186985 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6c9c6b8-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.180 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.181 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.183 186985 INFO os_vif [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:0b:cf,bridge_name='br-int',has_traffic_filtering=True,id=c6c9c6b8-8279-46c9-839c-b2f5011e57b6,network=Network(c93da617-8dc2-443c-a250-32e835de0d91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c9c6b8-82')
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.184 186985 INFO nova.virt.libvirt.driver [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Deleting instance files /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661_del
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.185 186985 INFO nova.virt.libvirt.driver [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Deletion of /var/lib/nova/instances/ea3ef3d3-b413-4626-b3a0-e09ab809e661_del complete
Nov 22 10:13:32 compute-0 podman[219660]: 2025-11-22 10:13:32.196228022 +0000 UTC m=+0.041006105 container remove 1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 10:13:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:32.200 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[79a64e81-ab99-455b-aee8-657dc6665329]: (4, ('Sat Nov 22 10:13:32 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91 (1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5)\n1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5\nSat Nov 22 10:13:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91 (1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5)\n1a50321920b14f57e9e379df26bad4948bb0ef1cd65b6c1fdd7f783d5700b4b5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:32.202 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[2963f5eb-bbd4-4783-a538-e512ef4fd5ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:32.203 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc93da617-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.205 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:32 compute-0 kernel: tapc93da617-80: left promiscuous mode
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.215 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.219 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:32.220 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[fe9f7f30-473e-4fe3-aa4f-e1c15cbd2c47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:32.238 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[5dcd930f-248c-40b3-985a-d21478580077]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:32.239 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[bd41c6e3-05dc-416c-9b6d-62f4e609d370]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:32.253 213484 DEBUG oslo.privsep.daemon [-] privsep: reply[a39a4c40-3aad-4422-b6d0-713ec12c7f79]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383784, 'reachable_time': 23894, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219675, 'error': None, 'target': 'ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:32 compute-0 systemd[1]: run-netns-ovnmeta\x2dc93da617\x2d8dc2\x2d443c\x2da250\x2d32e835de0d91.mount: Deactivated successfully.
Nov 22 10:13:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:32.258 104329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c93da617-8dc2-443c-a250-32e835de0d91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 10:13:32 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:32.258 104329 DEBUG oslo.privsep.daemon [-] privsep: reply[a429b542-a508-4ec7-b5be-e18266cf5f7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.270 186985 INFO nova.compute.manager [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Took 0.40 seconds to destroy the instance on the hypervisor.
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.271 186985 DEBUG oslo.service.loopingcall [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.272 186985 DEBUG nova.compute.manager [-] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 10:13:32 compute-0 nova_compute[186981]: 2025-11-22 10:13:32.272 186985 DEBUG nova.network.neutron [-] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 10:13:33 compute-0 nova_compute[186981]: 2025-11-22 10:13:33.561 186985 DEBUG nova.network.neutron [-] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:13:33 compute-0 nova_compute[186981]: 2025-11-22 10:13:33.582 186985 INFO nova.compute.manager [-] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Took 1.31 seconds to deallocate network for instance.
Nov 22 10:13:33 compute-0 nova_compute[186981]: 2025-11-22 10:13:33.630 186985 DEBUG oslo_concurrency.lockutils [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:13:33 compute-0 nova_compute[186981]: 2025-11-22 10:13:33.630 186985 DEBUG oslo_concurrency.lockutils [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:13:33 compute-0 nova_compute[186981]: 2025-11-22 10:13:33.698 186985 DEBUG nova.compute.provider_tree [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:13:33 compute-0 nova_compute[186981]: 2025-11-22 10:13:33.711 186985 DEBUG nova.scheduler.client.report [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:13:33 compute-0 nova_compute[186981]: 2025-11-22 10:13:33.729 186985 DEBUG oslo_concurrency.lockutils [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:13:33 compute-0 nova_compute[186981]: 2025-11-22 10:13:33.764 186985 INFO nova.scheduler.client.report [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Deleted allocations for instance ea3ef3d3-b413-4626-b3a0-e09ab809e661
Nov 22 10:13:33 compute-0 nova_compute[186981]: 2025-11-22 10:13:33.859 186985 DEBUG oslo_concurrency.lockutils [None req-eeb3f546-6660-4e80-a850-224565fd9b6a fd88a700663e44618f0a22f234573806 b60c6181ec1c449ab3dd7a45969909f7 - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.993s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:13:33 compute-0 nova_compute[186981]: 2025-11-22 10:13:33.918 186985 DEBUG nova.compute.manager [req-29654ea8-ae60-4c29-ac71-bef5cff0b252 req-5ed2a809-3c63-4e84-8e8b-30283ab1964c 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Received event network-vif-deleted-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:13:34 compute-0 nova_compute[186981]: 2025-11-22 10:13:34.004 186985 DEBUG nova.network.neutron [req-7c25c956-8899-4f64-a26d-7f7d80374f71 req-e36ce158-b92c-4201-a64d-947daba839ad 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Updated VIF entry in instance network info cache for port c6c9c6b8-8279-46c9-839c-b2f5011e57b6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 10:13:34 compute-0 nova_compute[186981]: 2025-11-22 10:13:34.005 186985 DEBUG nova.network.neutron [req-7c25c956-8899-4f64-a26d-7f7d80374f71 req-e36ce158-b92c-4201-a64d-947daba839ad 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Updating instance_info_cache with network_info: [{"id": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "address": "fa:16:3e:dc:0b:cf", "network": {"id": "c93da617-8dc2-443c-a250-32e835de0d91", "bridge": "br-int", "label": "tempest-network-smoke--408892261", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b60c6181ec1c449ab3dd7a45969909f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c9c6b8-82", "ovs_interfaceid": "c6c9c6b8-8279-46c9-839c-b2f5011e57b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 10:13:34 compute-0 nova_compute[186981]: 2025-11-22 10:13:34.035 186985 DEBUG oslo_concurrency.lockutils [req-7c25c956-8899-4f64-a26d-7f7d80374f71 req-e36ce158-b92c-4201-a64d-947daba839ad 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Releasing lock "refresh_cache-ea3ef3d3-b413-4626-b3a0-e09ab809e661" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 10:13:34 compute-0 nova_compute[186981]: 2025-11-22 10:13:34.240 186985 DEBUG nova.compute.manager [req-667af77b-af7b-4dba-84b0-73d392f28022 req-13ed1781-dcc4-4f60-a03c-49c9ae1b6fbc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Received event network-vif-plugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 10:13:34 compute-0 nova_compute[186981]: 2025-11-22 10:13:34.241 186985 DEBUG oslo_concurrency.lockutils [req-667af77b-af7b-4dba-84b0-73d392f28022 req-13ed1781-dcc4-4f60-a03c-49c9ae1b6fbc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Acquiring lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:13:34 compute-0 nova_compute[186981]: 2025-11-22 10:13:34.241 186985 DEBUG oslo_concurrency.lockutils [req-667af77b-af7b-4dba-84b0-73d392f28022 req-13ed1781-dcc4-4f60-a03c-49c9ae1b6fbc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:13:34 compute-0 nova_compute[186981]: 2025-11-22 10:13:34.241 186985 DEBUG oslo_concurrency.lockutils [req-667af77b-af7b-4dba-84b0-73d392f28022 req-13ed1781-dcc4-4f60-a03c-49c9ae1b6fbc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] Lock "ea3ef3d3-b413-4626-b3a0-e09ab809e661-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:13:34 compute-0 nova_compute[186981]: 2025-11-22 10:13:34.241 186985 DEBUG nova.compute.manager [req-667af77b-af7b-4dba-84b0-73d392f28022 req-13ed1781-dcc4-4f60-a03c-49c9ae1b6fbc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] No waiting events found dispatching network-vif-plugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 10:13:34 compute-0 nova_compute[186981]: 2025-11-22 10:13:34.242 186985 WARNING nova.compute.manager [req-667af77b-af7b-4dba-84b0-73d392f28022 req-13ed1781-dcc4-4f60-a03c-49c9ae1b6fbc 1c21415e38b14949bd1b745184db5699 a30e6a2d43644fc7a60baaf378993f7b - - default default] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Received unexpected event network-vif-plugged-c6c9c6b8-8279-46c9-839c-b2f5011e57b6 for instance with vm_state deleted and task_state None.
Nov 22 10:13:36 compute-0 nova_compute[186981]: 2025-11-22 10:13:36.409 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:37 compute-0 nova_compute[186981]: 2025-11-22 10:13:37.178 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:37 compute-0 nova_compute[186981]: 2025-11-22 10:13:37.304 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:37 compute-0 nova_compute[186981]: 2025-11-22 10:13:37.395 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:39 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:13:39.358 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:13:41 compute-0 nova_compute[186981]: 2025-11-22 10:13:41.409 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:42 compute-0 nova_compute[186981]: 2025-11-22 10:13:42.181 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:42 compute-0 podman[219677]: 2025-11-22 10:13:42.598994432 +0000 UTC m=+0.053629077 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 10:13:42 compute-0 podman[219678]: 2025-11-22 10:13:42.655870058 +0000 UTC m=+0.098930149 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 10:13:46 compute-0 nova_compute[186981]: 2025-11-22 10:13:46.410 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:47 compute-0 nova_compute[186981]: 2025-11-22 10:13:47.141 186985 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763806412.1400712, ea3ef3d3-b413-4626-b3a0-e09ab809e661 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 10:13:47 compute-0 nova_compute[186981]: 2025-11-22 10:13:47.142 186985 INFO nova.compute.manager [-] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] VM Stopped (Lifecycle Event)
Nov 22 10:13:47 compute-0 nova_compute[186981]: 2025-11-22 10:13:47.167 186985 DEBUG nova.compute.manager [None req-efb3ed8f-1d80-4c46-bb06-e810197f9fd9 - - - - - -] [instance: ea3ef3d3-b413-4626-b3a0-e09ab809e661] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 10:13:47 compute-0 nova_compute[186981]: 2025-11-22 10:13:47.183 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:49 compute-0 podman[219723]: 2025-11-22 10:13:49.638573586 +0000 UTC m=+0.064489243 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal 
Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Nov 22 10:13:49 compute-0 podman[219722]: 2025-11-22 10:13:49.64645596 +0000 UTC m=+0.103897453 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:13:51 compute-0 nova_compute[186981]: 2025-11-22 10:13:51.442 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:51 compute-0 podman[219764]: 2025-11-22 10:13:51.629372083 +0000 UTC m=+0.078334028 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 10:13:51 compute-0 podman[219765]: 2025-11-22 10:13:51.661909747 +0000 UTC m=+0.096753889 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:13:52 compute-0 nova_compute[186981]: 2025-11-22 10:13:52.186 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:56 compute-0 nova_compute[186981]: 2025-11-22 10:13:56.443 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:13:57 compute-0 nova_compute[186981]: 2025-11-22 10:13:57.196 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:01 compute-0 nova_compute[186981]: 2025-11-22 10:14:01.444 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:01 compute-0 podman[219807]: 2025-11-22 10:14:01.614262904 +0000 UTC m=+0.058918371 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 10:14:02 compute-0 nova_compute[186981]: 2025-11-22 10:14:02.198 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:06 compute-0 nova_compute[186981]: 2025-11-22 10:14:06.482 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:07 compute-0 nova_compute[186981]: 2025-11-22 10:14:07.200 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:07 compute-0 ovn_controller[95329]: 2025-11-22T10:14:07Z|00177|memory_trim|INFO|Detected inactivity (last active 30026 ms ago): trimming memory
Nov 22 10:14:11 compute-0 nova_compute[186981]: 2025-11-22 10:14:11.538 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:12 compute-0 nova_compute[186981]: 2025-11-22 10:14:12.202 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:13 compute-0 podman[219833]: 2025-11-22 10:14:13.627494283 +0000 UTC m=+0.086088369 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:14:13 compute-0 podman[219834]: 2025-11-22 10:14:13.663306425 +0000 UTC m=+0.119038274 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:14:16 compute-0 nova_compute[186981]: 2025-11-22 10:14:16.540 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:14:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:14:17 compute-0 nova_compute[186981]: 2025-11-22 10:14:17.233 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:14:17.944 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:14:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:14:17.944 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:14:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:14:17.945 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:14:18 compute-0 nova_compute[186981]: 2025-11-22 10:14:18.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:14:20 compute-0 podman[219877]: 2025-11-22 10:14:20.631426049 +0000 UTC m=+0.075510642 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.632 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.632 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.632 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:14:20 compute-0 podman[219876]: 2025-11-22 10:14:20.643410384 +0000 UTC m=+0.091337312 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.656 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.656 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.657 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.657 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.820 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.821 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5762MB free_disk=73.4587516784668GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.821 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.821 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.879 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.879 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.902 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:14:20 compute-0 nova_compute[186981]: 2025-11-22 10:14:20.915 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:14:21 compute-0 nova_compute[186981]: 2025-11-22 10:14:21.078 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:14:21 compute-0 nova_compute[186981]: 2025-11-22 10:14:21.079 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:14:21 compute-0 nova_compute[186981]: 2025-11-22 10:14:21.543 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:22 compute-0 nova_compute[186981]: 2025-11-22 10:14:22.041 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:14:22 compute-0 nova_compute[186981]: 2025-11-22 10:14:22.041 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:14:22 compute-0 nova_compute[186981]: 2025-11-22 10:14:22.042 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:14:22 compute-0 nova_compute[186981]: 2025-11-22 10:14:22.235 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:22 compute-0 nova_compute[186981]: 2025-11-22 10:14:22.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:14:22 compute-0 podman[219916]: 2025-11-22 10:14:22.607644619 +0000 UTC m=+0.063452145 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:14:22 compute-0 podman[219917]: 2025-11-22 10:14:22.620500108 +0000 UTC m=+0.072193752 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:14:23 compute-0 nova_compute[186981]: 2025-11-22 10:14:23.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:14:24 compute-0 nova_compute[186981]: 2025-11-22 10:14:24.588 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:14:24 compute-0 sshd-session[219958]: Accepted publickey for zuul from 192.168.122.10 port 41908 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 10:14:24 compute-0 systemd-logind[819]: New session 26 of user zuul.
Nov 22 10:14:24 compute-0 systemd[1]: Started Session 26 of User zuul.
Nov 22 10:14:24 compute-0 sshd-session[219958]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 10:14:25 compute-0 sudo[219962]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 22 10:14:25 compute-0 sudo[219962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 10:14:26 compute-0 nova_compute[186981]: 2025-11-22 10:14:26.582 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:27 compute-0 nova_compute[186981]: 2025-11-22 10:14:27.237 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:30 compute-0 ovs-vsctl[220137]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 22 10:14:31 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 219986 (sos)
Nov 22 10:14:31 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 22 10:14:31 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 22 10:14:31 compute-0 virtqemud[186556]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 22 10:14:31 compute-0 virtqemud[186556]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 22 10:14:31 compute-0 virtqemud[186556]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 10:14:31 compute-0 nova_compute[186981]: 2025-11-22 10:14:31.622 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:31 compute-0 podman[220337]: 2025-11-22 10:14:31.876504245 +0000 UTC m=+0.053524644 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 10:14:32 compute-0 nova_compute[186981]: 2025-11-22 10:14:32.238 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:32 compute-0 crontab[220578]: (root) LIST (root)
Nov 22 10:14:34 compute-0 systemd[1]: Starting Hostname Service...
Nov 22 10:14:34 compute-0 systemd[1]: Started Hostname Service.
Nov 22 10:14:36 compute-0 nova_compute[186981]: 2025-11-22 10:14:36.623 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:37 compute-0 nova_compute[186981]: 2025-11-22 10:14:37.240 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:41 compute-0 ovs-appctl[221727]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 10:14:41 compute-0 ovs-appctl[221736]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 10:14:41 compute-0 ovs-appctl[221742]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 10:14:41 compute-0 nova_compute[186981]: 2025-11-22 10:14:41.625 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:42 compute-0 nova_compute[186981]: 2025-11-22 10:14:42.241 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:43 compute-0 podman[222649]: 2025-11-22 10:14:43.7997701 +0000 UTC m=+0.064668242 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 10:14:43 compute-0 podman[222651]: 2025-11-22 10:14:43.83098793 +0000 UTC m=+0.093635000 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 10:14:46 compute-0 nova_compute[186981]: 2025-11-22 10:14:46.681 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:47 compute-0 nova_compute[186981]: 2025-11-22 10:14:47.243 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:48 compute-0 virtqemud[186556]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 10:14:49 compute-0 systemd[1]: Starting Time & Date Service...
Nov 22 10:14:49 compute-0 systemd[1]: Started Time & Date Service.
Nov 22 10:14:51 compute-0 podman[223314]: 2025-11-22 10:14:51.624945875 +0000 UTC m=+0.079910866 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 10:14:51 compute-0 podman[223313]: 2025-11-22 10:14:51.643728887 +0000 UTC m=+0.092505980 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 10:14:51 compute-0 nova_compute[186981]: 2025-11-22 10:14:51.740 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:52 compute-0 nova_compute[186981]: 2025-11-22 10:14:52.246 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:53 compute-0 podman[223353]: 2025-11-22 10:14:53.632790841 +0000 UTC m=+0.068191498 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 10:14:53 compute-0 podman[223354]: 2025-11-22 10:14:53.657390901 +0000 UTC m=+0.081432868 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 10:14:56 compute-0 nova_compute[186981]: 2025-11-22 10:14:56.746 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:14:57 compute-0 nova_compute[186981]: 2025-11-22 10:14:57.249 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:01 compute-0 nova_compute[186981]: 2025-11-22 10:15:01.747 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:02 compute-0 nova_compute[186981]: 2025-11-22 10:15:02.252 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:02 compute-0 podman[223397]: 2025-11-22 10:15:02.613376859 +0000 UTC m=+0.070586773 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 10:15:06 compute-0 nova_compute[186981]: 2025-11-22 10:15:06.749 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:07 compute-0 nova_compute[186981]: 2025-11-22 10:15:07.254 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:09 compute-0 sudo[219962]: pam_unix(sudo:session): session closed for user root
Nov 22 10:15:09 compute-0 sshd-session[219961]: Received disconnect from 192.168.122.10 port 41908:11: disconnected by user
Nov 22 10:15:09 compute-0 sshd-session[219961]: Disconnected from user zuul 192.168.122.10 port 41908
Nov 22 10:15:09 compute-0 sshd-session[219958]: pam_unix(sshd:session): session closed for user zuul
Nov 22 10:15:09 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Nov 22 10:15:09 compute-0 systemd[1]: session-26.scope: Consumed 1min 13.519s CPU time, 487.3M memory peak, read 100.9M from disk, written 23.1M to disk.
Nov 22 10:15:09 compute-0 systemd-logind[819]: Session 26 logged out. Waiting for processes to exit.
Nov 22 10:15:09 compute-0 systemd-logind[819]: Removed session 26.
Nov 22 10:15:09 compute-0 sshd-session[223422]: Accepted publickey for zuul from 192.168.122.10 port 37412 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 10:15:09 compute-0 systemd-logind[819]: New session 27 of user zuul.
Nov 22 10:15:09 compute-0 systemd[1]: Started Session 27 of User zuul.
Nov 22 10:15:09 compute-0 sshd-session[223422]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 10:15:10 compute-0 sudo[223426]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-11-22-uikbnkh.tar.xz
Nov 22 10:15:10 compute-0 sudo[223426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 10:15:10 compute-0 sudo[223426]: pam_unix(sudo:session): session closed for user root
Nov 22 10:15:10 compute-0 sshd-session[223425]: Received disconnect from 192.168.122.10 port 37412:11: disconnected by user
Nov 22 10:15:10 compute-0 sshd-session[223425]: Disconnected from user zuul 192.168.122.10 port 37412
Nov 22 10:15:10 compute-0 sshd-session[223422]: pam_unix(sshd:session): session closed for user zuul
Nov 22 10:15:10 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Nov 22 10:15:10 compute-0 systemd-logind[819]: Session 27 logged out. Waiting for processes to exit.
Nov 22 10:15:10 compute-0 systemd-logind[819]: Removed session 27.
Nov 22 10:15:10 compute-0 sshd-session[223451]: Accepted publickey for zuul from 192.168.122.10 port 37424 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 10:15:10 compute-0 systemd-logind[819]: New session 28 of user zuul.
Nov 22 10:15:10 compute-0 systemd[1]: Started Session 28 of User zuul.
Nov 22 10:15:10 compute-0 sshd-session[223451]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 10:15:10 compute-0 sudo[223455]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Nov 22 10:15:10 compute-0 sudo[223455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 10:15:10 compute-0 sudo[223455]: pam_unix(sudo:session): session closed for user root
Nov 22 10:15:10 compute-0 sshd-session[223454]: Received disconnect from 192.168.122.10 port 37424:11: disconnected by user
Nov 22 10:15:10 compute-0 sshd-session[223454]: Disconnected from user zuul 192.168.122.10 port 37424
Nov 22 10:15:10 compute-0 sshd-session[223451]: pam_unix(sshd:session): session closed for user zuul
Nov 22 10:15:10 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Nov 22 10:15:10 compute-0 systemd-logind[819]: Session 28 logged out. Waiting for processes to exit.
Nov 22 10:15:10 compute-0 systemd-logind[819]: Removed session 28.
Nov 22 10:15:11 compute-0 nova_compute[186981]: 2025-11-22 10:15:11.804 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:12 compute-0 nova_compute[186981]: 2025-11-22 10:15:12.257 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:14 compute-0 podman[223480]: 2025-11-22 10:15:14.615469174 +0000 UTC m=+0.063083918 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 10:15:14 compute-0 podman[223481]: 2025-11-22 10:15:14.630891074 +0000 UTC m=+0.084278116 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:15:16 compute-0 nova_compute[186981]: 2025-11-22 10:15:16.803 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:17 compute-0 nova_compute[186981]: 2025-11-22 10:15:17.258 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:15:17.944 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:15:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:15:17.945 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:15:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:15:17.945 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:15:19 compute-0 nova_compute[186981]: 2025-11-22 10:15:19.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:15:19 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 10:15:20 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 10:15:20 compute-0 nova_compute[186981]: 2025-11-22 10:15:20.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:15:20 compute-0 nova_compute[186981]: 2025-11-22 10:15:20.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:15:20 compute-0 nova_compute[186981]: 2025-11-22 10:15:20.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:15:20 compute-0 nova_compute[186981]: 2025-11-22 10:15:20.613 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:15:20 compute-0 nova_compute[186981]: 2025-11-22 10:15:20.613 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.630 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.630 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.630 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.630 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.767 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.768 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5648MB free_disk=73.45566940307617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.768 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.768 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.806 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.833 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.834 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.862 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.876 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.877 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:15:21 compute-0 nova_compute[186981]: 2025-11-22 10:15:21.877 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:15:22 compute-0 nova_compute[186981]: 2025-11-22 10:15:22.260 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:22 compute-0 podman[223533]: 2025-11-22 10:15:22.6077807 +0000 UTC m=+0.054662889 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-type=git, name=ubi9-minimal, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350)
Nov 22 10:15:22 compute-0 podman[223532]: 2025-11-22 10:15:22.625513773 +0000 UTC m=+0.065113744 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 10:15:22 compute-0 nova_compute[186981]: 2025-11-22 10:15:22.876 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:15:22 compute-0 nova_compute[186981]: 2025-11-22 10:15:22.877 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:15:23 compute-0 nova_compute[186981]: 2025-11-22 10:15:23.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:15:24 compute-0 podman[223573]: 2025-11-22 10:15:24.587161601 +0000 UTC m=+0.044249527 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 10:15:24 compute-0 nova_compute[186981]: 2025-11-22 10:15:24.589 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:15:24 compute-0 nova_compute[186981]: 2025-11-22 10:15:24.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:15:24 compute-0 podman[223574]: 2025-11-22 10:15:24.6102942 +0000 UTC m=+0.064312982 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 10:15:26 compute-0 nova_compute[186981]: 2025-11-22 10:15:26.588 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:15:26 compute-0 nova_compute[186981]: 2025-11-22 10:15:26.808 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:27 compute-0 nova_compute[186981]: 2025-11-22 10:15:27.262 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:31 compute-0 nova_compute[186981]: 2025-11-22 10:15:31.860 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:32 compute-0 nova_compute[186981]: 2025-11-22 10:15:32.263 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:33 compute-0 podman[223616]: 2025-11-22 10:15:33.629239875 +0000 UTC m=+0.073512633 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 10:15:36 compute-0 nova_compute[186981]: 2025-11-22 10:15:36.861 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:37 compute-0 nova_compute[186981]: 2025-11-22 10:15:37.266 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:41 compute-0 nova_compute[186981]: 2025-11-22 10:15:41.912 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:42 compute-0 nova_compute[186981]: 2025-11-22 10:15:42.268 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:45 compute-0 podman[223641]: 2025-11-22 10:15:45.600417129 +0000 UTC m=+0.056467769 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 10:15:45 compute-0 podman[223642]: 2025-11-22 10:15:45.661004448 +0000 UTC m=+0.103767687 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:15:46 compute-0 nova_compute[186981]: 2025-11-22 10:15:46.914 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:47 compute-0 nova_compute[186981]: 2025-11-22 10:15:47.341 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:51 compute-0 nova_compute[186981]: 2025-11-22 10:15:51.917 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:52 compute-0 nova_compute[186981]: 2025-11-22 10:15:52.343 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:53 compute-0 podman[223687]: 2025-11-22 10:15:53.615736353 +0000 UTC m=+0.064026095 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:15:53 compute-0 podman[223688]: 2025-11-22 10:15:53.645302817 +0000 UTC m=+0.085419726 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 10:15:55 compute-0 podman[223728]: 2025-11-22 10:15:55.604210071 +0000 UTC m=+0.060156190 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 10:15:55 compute-0 podman[223729]: 2025-11-22 10:15:55.624607656 +0000 UTC m=+0.072951298 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:15:56 compute-0 nova_compute[186981]: 2025-11-22 10:15:56.919 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:15:57 compute-0 nova_compute[186981]: 2025-11-22 10:15:57.345 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:01 compute-0 nova_compute[186981]: 2025-11-22 10:16:01.923 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:02 compute-0 nova_compute[186981]: 2025-11-22 10:16:02.348 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:04 compute-0 podman[223772]: 2025-11-22 10:16:04.627496651 +0000 UTC m=+0.067782147 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 10:16:06 compute-0 nova_compute[186981]: 2025-11-22 10:16:06.922 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:07 compute-0 nova_compute[186981]: 2025-11-22 10:16:07.351 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:11 compute-0 nova_compute[186981]: 2025-11-22 10:16:11.922 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:12 compute-0 nova_compute[186981]: 2025-11-22 10:16:12.352 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:16 compute-0 podman[223798]: 2025-11-22 10:16:16.644548334 +0000 UTC m=+0.104068885 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 10:16:16 compute-0 podman[223799]: 2025-11-22 10:16:16.705587746 +0000 UTC m=+0.154274162 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:16:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:16:16 compute-0 nova_compute[186981]: 2025-11-22 10:16:16.924 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:17 compute-0 nova_compute[186981]: 2025-11-22 10:16:17.353 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:16:17.946 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:16:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:16:17.946 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:16:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:16:17.947 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:16:19 compute-0 nova_compute[186981]: 2025-11-22 10:16:19.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:16:20 compute-0 nova_compute[186981]: 2025-11-22 10:16:20.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.619 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.620 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.649 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.649 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.650 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.650 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.828 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.829 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5741MB free_disk=73.45571899414062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.829 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.830 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.926 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.938 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.939 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.976 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.992 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.994 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:16:21 compute-0 nova_compute[186981]: 2025-11-22 10:16:21.994 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:16:22 compute-0 nova_compute[186981]: 2025-11-22 10:16:22.354 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:22 compute-0 nova_compute[186981]: 2025-11-22 10:16:22.968 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:16:23 compute-0 nova_compute[186981]: 2025-11-22 10:16:23.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:16:24 compute-0 nova_compute[186981]: 2025-11-22 10:16:24.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:16:24 compute-0 nova_compute[186981]: 2025-11-22 10:16:24.593 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:16:24 compute-0 podman[223846]: 2025-11-22 10:16:24.638577588 +0000 UTC m=+0.093768014 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 10:16:24 compute-0 podman[223847]: 2025-11-22 10:16:24.647008568 +0000 UTC m=+0.086308132 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., version=9.6, vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 10:16:26 compute-0 nova_compute[186981]: 2025-11-22 10:16:26.589 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:16:26 compute-0 nova_compute[186981]: 2025-11-22 10:16:26.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:16:26 compute-0 podman[223887]: 2025-11-22 10:16:26.626115701 +0000 UTC m=+0.076580007 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 10:16:26 compute-0 podman[223888]: 2025-11-22 10:16:26.642420944 +0000 UTC m=+0.088598113 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 10:16:26 compute-0 nova_compute[186981]: 2025-11-22 10:16:26.956 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:27 compute-0 nova_compute[186981]: 2025-11-22 10:16:27.355 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:31 compute-0 nova_compute[186981]: 2025-11-22 10:16:31.960 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:32 compute-0 nova_compute[186981]: 2025-11-22 10:16:32.356 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:35 compute-0 podman[223931]: 2025-11-22 10:16:35.583361034 +0000 UTC m=+0.043469665 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 10:16:36 compute-0 nova_compute[186981]: 2025-11-22 10:16:36.962 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:37 compute-0 nova_compute[186981]: 2025-11-22 10:16:37.358 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:41 compute-0 nova_compute[186981]: 2025-11-22 10:16:41.965 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:42 compute-0 nova_compute[186981]: 2025-11-22 10:16:42.362 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:47 compute-0 nova_compute[186981]: 2025-11-22 10:16:47.007 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:47 compute-0 nova_compute[186981]: 2025-11-22 10:16:47.364 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:47 compute-0 podman[223956]: 2025-11-22 10:16:47.627268488 +0000 UTC m=+0.082956629 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:16:47 compute-0 podman[223957]: 2025-11-22 10:16:47.668691566 +0000 UTC m=+0.121430307 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 10:16:52 compute-0 nova_compute[186981]: 2025-11-22 10:16:52.009 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:52 compute-0 nova_compute[186981]: 2025-11-22 10:16:52.366 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:55 compute-0 podman[224000]: 2025-11-22 10:16:55.610424966 +0000 UTC m=+0.060456678 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:16:55 compute-0 podman[224001]: 2025-11-22 10:16:55.611430373 +0000 UTC m=+0.061711252 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.)
Nov 22 10:16:57 compute-0 nova_compute[186981]: 2025-11-22 10:16:57.060 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:57 compute-0 nova_compute[186981]: 2025-11-22 10:16:57.368 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:16:57 compute-0 podman[224038]: 2025-11-22 10:16:57.623092872 +0000 UTC m=+0.074145649 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 10:16:57 compute-0 podman[224039]: 2025-11-22 10:16:57.637157375 +0000 UTC m=+0.083225376 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 10:17:02 compute-0 nova_compute[186981]: 2025-11-22 10:17:02.061 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:02 compute-0 nova_compute[186981]: 2025-11-22 10:17:02.369 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:06 compute-0 podman[224080]: 2025-11-22 10:17:06.604979177 +0000 UTC m=+0.063495680 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:17:07 compute-0 nova_compute[186981]: 2025-11-22 10:17:07.063 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:07 compute-0 nova_compute[186981]: 2025-11-22 10:17:07.370 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:12 compute-0 nova_compute[186981]: 2025-11-22 10:17:12.064 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:12 compute-0 nova_compute[186981]: 2025-11-22 10:17:12.371 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:17 compute-0 nova_compute[186981]: 2025-11-22 10:17:17.066 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:17 compute-0 nova_compute[186981]: 2025-11-22 10:17:17.374 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:17:17.946 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:17:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:17:17.947 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:17:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:17:17.947 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:17:18 compute-0 podman[224104]: 2025-11-22 10:17:18.630716598 +0000 UTC m=+0.081509171 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 10:17:18 compute-0 podman[224105]: 2025-11-22 10:17:18.650403724 +0000 UTC m=+0.107540829 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 10:17:19 compute-0 nova_compute[186981]: 2025-11-22 10:17:19.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:21 compute-0 nova_compute[186981]: 2025-11-22 10:17:21.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:21 compute-0 nova_compute[186981]: 2025-11-22 10:17:21.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:17:21 compute-0 nova_compute[186981]: 2025-11-22 10:17:21.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:17:21 compute-0 nova_compute[186981]: 2025-11-22 10:17:21.617 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:17:22 compute-0 nova_compute[186981]: 2025-11-22 10:17:22.068 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:22 compute-0 nova_compute[186981]: 2025-11-22 10:17:22.377 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:22 compute-0 nova_compute[186981]: 2025-11-22 10:17:22.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:23 compute-0 nova_compute[186981]: 2025-11-22 10:17:23.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:23 compute-0 nova_compute[186981]: 2025-11-22 10:17:23.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:23 compute-0 nova_compute[186981]: 2025-11-22 10:17:23.629 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:17:23 compute-0 nova_compute[186981]: 2025-11-22 10:17:23.629 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:17:23 compute-0 nova_compute[186981]: 2025-11-22 10:17:23.630 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:17:23 compute-0 nova_compute[186981]: 2025-11-22 10:17:23.630 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:17:23 compute-0 nova_compute[186981]: 2025-11-22 10:17:23.773 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:17:23 compute-0 nova_compute[186981]: 2025-11-22 10:17:23.775 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5744MB free_disk=73.45571899414062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:17:23 compute-0 nova_compute[186981]: 2025-11-22 10:17:23.775 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:17:23 compute-0 nova_compute[186981]: 2025-11-22 10:17:23.775 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:17:24 compute-0 nova_compute[186981]: 2025-11-22 10:17:24.026 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:17:24 compute-0 nova_compute[186981]: 2025-11-22 10:17:24.027 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:17:24 compute-0 nova_compute[186981]: 2025-11-22 10:17:24.369 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:17:24 compute-0 nova_compute[186981]: 2025-11-22 10:17:24.499 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:17:24 compute-0 nova_compute[186981]: 2025-11-22 10:17:24.501 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:17:24 compute-0 nova_compute[186981]: 2025-11-22 10:17:24.501 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.726s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:17:24 compute-0 nova_compute[186981]: 2025-11-22 10:17:24.502 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:24 compute-0 nova_compute[186981]: 2025-11-22 10:17:24.502 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 10:17:24 compute-0 nova_compute[186981]: 2025-11-22 10:17:24.526 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 10:17:25 compute-0 nova_compute[186981]: 2025-11-22 10:17:25.525 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:25 compute-0 nova_compute[186981]: 2025-11-22 10:17:25.526 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:25 compute-0 nova_compute[186981]: 2025-11-22 10:17:25.526 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:17:26 compute-0 podman[224150]: 2025-11-22 10:17:26.594782554 +0000 UTC m=+0.050711891 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 10:17:26 compute-0 podman[224151]: 2025-11-22 10:17:26.627985169 +0000 UTC m=+0.067519120 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, name=ubi9-minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7)
Nov 22 10:17:27 compute-0 nova_compute[186981]: 2025-11-22 10:17:27.104 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:27 compute-0 nova_compute[186981]: 2025-11-22 10:17:27.379 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:27 compute-0 nova_compute[186981]: 2025-11-22 10:17:27.406 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:27 compute-0 nova_compute[186981]: 2025-11-22 10:17:27.611 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:28 compute-0 podman[224191]: 2025-11-22 10:17:28.587310913 +0000 UTC m=+0.046922459 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 10:17:28 compute-0 nova_compute[186981]: 2025-11-22 10:17:28.589 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:28 compute-0 podman[224192]: 2025-11-22 10:17:28.602331412 +0000 UTC m=+0.054035843 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 10:17:30 compute-0 nova_compute[186981]: 2025-11-22 10:17:30.589 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:30 compute-0 nova_compute[186981]: 2025-11-22 10:17:30.612 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:32 compute-0 nova_compute[186981]: 2025-11-22 10:17:32.106 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:32 compute-0 nova_compute[186981]: 2025-11-22 10:17:32.380 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:33 compute-0 nova_compute[186981]: 2025-11-22 10:17:33.608 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:17:33 compute-0 nova_compute[186981]: 2025-11-22 10:17:33.609 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 10:17:37 compute-0 nova_compute[186981]: 2025-11-22 10:17:37.109 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:37 compute-0 nova_compute[186981]: 2025-11-22 10:17:37.381 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:37 compute-0 podman[224235]: 2025-11-22 10:17:37.626786445 +0000 UTC m=+0.085616102 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 10:17:42 compute-0 nova_compute[186981]: 2025-11-22 10:17:42.109 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:42 compute-0 nova_compute[186981]: 2025-11-22 10:17:42.381 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:47 compute-0 nova_compute[186981]: 2025-11-22 10:17:47.114 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:47 compute-0 nova_compute[186981]: 2025-11-22 10:17:47.382 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:49 compute-0 podman[224259]: 2025-11-22 10:17:49.622742395 +0000 UTC m=+0.072585158 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 10:17:49 compute-0 podman[224260]: 2025-11-22 10:17:49.652878115 +0000 UTC m=+0.093686522 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:17:52 compute-0 nova_compute[186981]: 2025-11-22 10:17:52.117 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:52 compute-0 nova_compute[186981]: 2025-11-22 10:17:52.384 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:57 compute-0 nova_compute[186981]: 2025-11-22 10:17:57.153 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:57 compute-0 nova_compute[186981]: 2025-11-22 10:17:57.386 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:17:57 compute-0 podman[224305]: 2025-11-22 10:17:57.615400882 +0000 UTC m=+0.066787160 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 10:17:57 compute-0 podman[224306]: 2025-11-22 10:17:57.650407385 +0000 UTC m=+0.088879352 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 10:17:59 compute-0 podman[224343]: 2025-11-22 10:17:59.63948915 +0000 UTC m=+0.080660568 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:17:59 compute-0 podman[224344]: 2025-11-22 10:17:59.643477819 +0000 UTC m=+0.079072345 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 10:18:02 compute-0 nova_compute[186981]: 2025-11-22 10:18:02.154 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:02 compute-0 nova_compute[186981]: 2025-11-22 10:18:02.388 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:06 compute-0 nova_compute[186981]: 2025-11-22 10:18:06.663 186985 DEBUG oslo_concurrency.processutils [None req-c2d191f3-ec7b-4a0d-afb8-b22e0d81d69b c4931bea570642819b92d3f70cfdb07b b797995ce7e2414bb591227b83fccf41 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 10:18:06 compute-0 nova_compute[186981]: 2025-11-22 10:18:06.688 186985 DEBUG oslo_concurrency.processutils [None req-c2d191f3-ec7b-4a0d-afb8-b22e0d81d69b c4931bea570642819b92d3f70cfdb07b b797995ce7e2414bb591227b83fccf41 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 10:18:07 compute-0 nova_compute[186981]: 2025-11-22 10:18:07.157 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:07 compute-0 nova_compute[186981]: 2025-11-22 10:18:07.390 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:08 compute-0 podman[224388]: 2025-11-22 10:18:08.643663231 +0000 UTC m=+0.085195301 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 10:18:12 compute-0 nova_compute[186981]: 2025-11-22 10:18:12.158 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:12 compute-0 nova_compute[186981]: 2025-11-22 10:18:12.378 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:12 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:18:12.379 104216 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '92:4b:82', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'e6:9b:23:63:8a:4a'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 10:18:12 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:18:12.381 104216 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 10:18:12 compute-0 nova_compute[186981]: 2025-11-22 10:18:12.391 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:18:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:18:17 compute-0 nova_compute[186981]: 2025-11-22 10:18:17.220 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:17 compute-0 nova_compute[186981]: 2025-11-22 10:18:17.393 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:18:17.948 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:18:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:18:17.948 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:18:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:18:17.949 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:18:19 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:18:19.384 104216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f6533837-2723-4772-a9db-3c9eeea0db5c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 10:18:20 compute-0 nova_compute[186981]: 2025-11-22 10:18:20.619 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:18:20 compute-0 podman[224413]: 2025-11-22 10:18:20.701530157 +0000 UTC m=+0.152596177 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 10:18:20 compute-0 podman[224412]: 2025-11-22 10:18:20.705085983 +0000 UTC m=+0.150877309 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 10:18:22 compute-0 nova_compute[186981]: 2025-11-22 10:18:22.221 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:22 compute-0 nova_compute[186981]: 2025-11-22 10:18:22.395 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:22 compute-0 nova_compute[186981]: 2025-11-22 10:18:22.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:18:22 compute-0 nova_compute[186981]: 2025-11-22 10:18:22.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:18:22 compute-0 nova_compute[186981]: 2025-11-22 10:18:22.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:18:22 compute-0 nova_compute[186981]: 2025-11-22 10:18:22.617 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:18:23 compute-0 nova_compute[186981]: 2025-11-22 10:18:23.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:18:24 compute-0 nova_compute[186981]: 2025-11-22 10:18:24.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:18:24 compute-0 nova_compute[186981]: 2025-11-22 10:18:24.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.644 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.644 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.644 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.645 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.802 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.804 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5742MB free_disk=73.45571899414062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.804 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.804 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.881 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.882 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:18:25 compute-0 nova_compute[186981]: 2025-11-22 10:18:25.986 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing inventories for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 10:18:26 compute-0 nova_compute[186981]: 2025-11-22 10:18:26.029 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating ProviderTree inventory for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 10:18:26 compute-0 nova_compute[186981]: 2025-11-22 10:18:26.030 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Updating inventory in ProviderTree for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 10:18:26 compute-0 nova_compute[186981]: 2025-11-22 10:18:26.069 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing aggregate associations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 10:18:26 compute-0 nova_compute[186981]: 2025-11-22 10:18:26.100 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Refreshing trait associations for resource provider dd02da68-d6c7-4f1a-8710-21abb7ad1703, traits: COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE2,HW_CPU_X86_F16C,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE4A,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 10:18:26 compute-0 nova_compute[186981]: 2025-11-22 10:18:26.121 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:18:26 compute-0 nova_compute[186981]: 2025-11-22 10:18:26.136 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:18:26 compute-0 nova_compute[186981]: 2025-11-22 10:18:26.139 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:18:26 compute-0 nova_compute[186981]: 2025-11-22 10:18:26.140 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:18:27 compute-0 nova_compute[186981]: 2025-11-22 10:18:27.222 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:27 compute-0 nova_compute[186981]: 2025-11-22 10:18:27.396 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:28 compute-0 nova_compute[186981]: 2025-11-22 10:18:28.141 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:18:28 compute-0 nova_compute[186981]: 2025-11-22 10:18:28.589 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:18:28 compute-0 podman[224458]: 2025-11-22 10:18:28.592381709 +0000 UTC m=+0.049898078 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 22 10:18:28 compute-0 podman[224459]: 2025-11-22 10:18:28.608598721 +0000 UTC m=+0.058408251 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 10:18:30 compute-0 podman[224498]: 2025-11-22 10:18:30.58645003 +0000 UTC m=+0.047550676 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 10:18:30 compute-0 podman[224499]: 2025-11-22 10:18:30.592413422 +0000 UTC m=+0.049101508 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 10:18:32 compute-0 nova_compute[186981]: 2025-11-22 10:18:32.223 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:32 compute-0 nova_compute[186981]: 2025-11-22 10:18:32.398 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:37 compute-0 nova_compute[186981]: 2025-11-22 10:18:37.270 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:37 compute-0 nova_compute[186981]: 2025-11-22 10:18:37.400 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:39 compute-0 podman[224538]: 2025-11-22 10:18:39.595467632 +0000 UTC m=+0.052375838 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 10:18:42 compute-0 nova_compute[186981]: 2025-11-22 10:18:42.271 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:42 compute-0 nova_compute[186981]: 2025-11-22 10:18:42.401 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:47 compute-0 nova_compute[186981]: 2025-11-22 10:18:47.310 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:47 compute-0 nova_compute[186981]: 2025-11-22 10:18:47.403 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:51 compute-0 podman[224562]: 2025-11-22 10:18:51.628597653 +0000 UTC m=+0.075780626 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 10:18:51 compute-0 podman[224563]: 2025-11-22 10:18:51.637505585 +0000 UTC m=+0.093002794 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 10:18:52 compute-0 nova_compute[186981]: 2025-11-22 10:18:52.312 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:52 compute-0 nova_compute[186981]: 2025-11-22 10:18:52.404 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:57 compute-0 nova_compute[186981]: 2025-11-22 10:18:57.313 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:57 compute-0 nova_compute[186981]: 2025-11-22 10:18:57.405 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:18:59 compute-0 podman[224608]: 2025-11-22 10:18:59.607237518 +0000 UTC m=+0.058870716 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal 
is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64)
Nov 22 10:18:59 compute-0 podman[224607]: 2025-11-22 10:18:59.616377107 +0000 UTC m=+0.072667072 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 10:19:01 compute-0 podman[224644]: 2025-11-22 10:19:01.627976461 +0000 UTC m=+0.078743728 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 10:19:01 compute-0 podman[224643]: 2025-11-22 10:19:01.629644427 +0000 UTC m=+0.078153422 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 10:19:02 compute-0 nova_compute[186981]: 2025-11-22 10:19:02.315 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:02 compute-0 nova_compute[186981]: 2025-11-22 10:19:02.407 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:07 compute-0 nova_compute[186981]: 2025-11-22 10:19:07.348 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:07 compute-0 nova_compute[186981]: 2025-11-22 10:19:07.408 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:10 compute-0 podman[224688]: 2025-11-22 10:19:10.63539437 +0000 UTC m=+0.088443281 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:19:12 compute-0 nova_compute[186981]: 2025-11-22 10:19:12.351 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:12 compute-0 nova_compute[186981]: 2025-11-22 10:19:12.409 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:17 compute-0 nova_compute[186981]: 2025-11-22 10:19:17.351 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:17 compute-0 nova_compute[186981]: 2025-11-22 10:19:17.410 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:19:17.949 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:19:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:19:17.950 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:19:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:19:17.950 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:19:22 compute-0 nova_compute[186981]: 2025-11-22 10:19:22.354 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:22 compute-0 nova_compute[186981]: 2025-11-22 10:19:22.412 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:22 compute-0 nova_compute[186981]: 2025-11-22 10:19:22.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:19:22 compute-0 nova_compute[186981]: 2025-11-22 10:19:22.593 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:19:22 compute-0 nova_compute[186981]: 2025-11-22 10:19:22.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:19:22 compute-0 podman[224714]: 2025-11-22 10:19:22.602459562 +0000 UTC m=+0.058180427 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 22 10:19:22 compute-0 nova_compute[186981]: 2025-11-22 10:19:22.611 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:19:22 compute-0 nova_compute[186981]: 2025-11-22 10:19:22.613 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:19:22 compute-0 podman[224715]: 2025-11-22 10:19:22.634343152 +0000 UTC m=+0.089378528 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 22 10:19:24 compute-0 nova_compute[186981]: 2025-11-22 10:19:24.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:19:25 compute-0 nova_compute[186981]: 2025-11-22 10:19:25.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:19:25 compute-0 nova_compute[186981]: 2025-11-22 10:19:25.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:19:25 compute-0 nova_compute[186981]: 2025-11-22 10:19:25.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:19:25 compute-0 nova_compute[186981]: 2025-11-22 10:19:25.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.632 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.633 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.634 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.634 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.836 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.837 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5734MB free_disk=73.45571899414062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.837 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.838 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.926 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.926 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.952 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.969 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.972 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:19:26 compute-0 nova_compute[186981]: 2025-11-22 10:19:26.973 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:19:27 compute-0 nova_compute[186981]: 2025-11-22 10:19:27.355 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:27 compute-0 nova_compute[186981]: 2025-11-22 10:19:27.413 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:27 compute-0 nova_compute[186981]: 2025-11-22 10:19:27.975 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:19:30 compute-0 nova_compute[186981]: 2025-11-22 10:19:30.588 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:19:30 compute-0 podman[224759]: 2025-11-22 10:19:30.660335354 +0000 UTC m=+0.098309922 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.expose-services=, config_id=edpm)
Nov 22 10:19:30 compute-0 podman[224758]: 2025-11-22 10:19:30.672010402 +0000 UTC m=+0.114250006 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Nov 22 10:19:32 compute-0 nova_compute[186981]: 2025-11-22 10:19:32.357 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:32 compute-0 nova_compute[186981]: 2025-11-22 10:19:32.414 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:32 compute-0 podman[224797]: 2025-11-22 10:19:32.607654196 +0000 UTC m=+0.064506850 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:19:32 compute-0 podman[224798]: 2025-11-22 10:19:32.635902216 +0000 UTC m=+0.081731189 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Nov 22 10:19:33 compute-0 nova_compute[186981]: 2025-11-22 10:19:33.589 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:19:37 compute-0 nova_compute[186981]: 2025-11-22 10:19:37.399 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:37 compute-0 nova_compute[186981]: 2025-11-22 10:19:37.415 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:41 compute-0 podman[224839]: 2025-11-22 10:19:41.636610663 +0000 UTC m=+0.077677579 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 10:19:42 compute-0 nova_compute[186981]: 2025-11-22 10:19:42.400 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:42 compute-0 nova_compute[186981]: 2025-11-22 10:19:42.417 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:47 compute-0 nova_compute[186981]: 2025-11-22 10:19:47.400 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:47 compute-0 nova_compute[186981]: 2025-11-22 10:19:47.418 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:52 compute-0 nova_compute[186981]: 2025-11-22 10:19:52.403 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:52 compute-0 nova_compute[186981]: 2025-11-22 10:19:52.419 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:53 compute-0 podman[224863]: 2025-11-22 10:19:53.627176096 +0000 UTC m=+0.082132880 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 10:19:53 compute-0 podman[224864]: 2025-11-22 10:19:53.659131397 +0000 UTC m=+0.103696748 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 10:19:57 compute-0 nova_compute[186981]: 2025-11-22 10:19:57.405 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:19:57 compute-0 nova_compute[186981]: 2025-11-22 10:19:57.420 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:01 compute-0 podman[224910]: 2025-11-22 10:20:01.630487379 +0000 UTC m=+0.076658502 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 10:20:01 compute-0 podman[224911]: 2025-11-22 10:20:01.650200607 +0000 UTC m=+0.101519250 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 10:20:02 compute-0 nova_compute[186981]: 2025-11-22 10:20:02.408 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:02 compute-0 nova_compute[186981]: 2025-11-22 10:20:02.422 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:03 compute-0 podman[224950]: 2025-11-22 10:20:03.616508897 +0000 UTC m=+0.066141895 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:20:03 compute-0 podman[224951]: 2025-11-22 10:20:03.625557303 +0000 UTC m=+0.072461797 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:20:07 compute-0 nova_compute[186981]: 2025-11-22 10:20:07.423 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:20:07 compute-0 nova_compute[186981]: 2025-11-22 10:20:07.425 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:20:07 compute-0 nova_compute[186981]: 2025-11-22 10:20:07.425 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 22 10:20:07 compute-0 nova_compute[186981]: 2025-11-22 10:20:07.426 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:20:07 compute-0 nova_compute[186981]: 2025-11-22 10:20:07.466 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:07 compute-0 nova_compute[186981]: 2025-11-22 10:20:07.466 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:20:12 compute-0 nova_compute[186981]: 2025-11-22 10:20:12.468 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:20:12 compute-0 nova_compute[186981]: 2025-11-22 10:20:12.469 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:12 compute-0 nova_compute[186981]: 2025-11-22 10:20:12.469 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 22 10:20:12 compute-0 nova_compute[186981]: 2025-11-22 10:20:12.469 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:20:12 compute-0 nova_compute[186981]: 2025-11-22 10:20:12.469 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:20:12 compute-0 nova_compute[186981]: 2025-11-22 10:20:12.470 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:12 compute-0 podman[224994]: 2025-11-22 10:20:12.611055986 +0000 UTC m=+0.056614955 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:20:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:20:17 compute-0 nova_compute[186981]: 2025-11-22 10:20:17.470 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:17 compute-0 nova_compute[186981]: 2025-11-22 10:20:17.471 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:20:17.951 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:20:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:20:17.951 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:20:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:20:17.951 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:20:22 compute-0 nova_compute[186981]: 2025-11-22 10:20:22.472 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:20:22 compute-0 nova_compute[186981]: 2025-11-22 10:20:22.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:20:22 compute-0 nova_compute[186981]: 2025-11-22 10:20:22.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:20:22 compute-0 nova_compute[186981]: 2025-11-22 10:20:22.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:20:22 compute-0 nova_compute[186981]: 2025-11-22 10:20:22.615 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:20:24 compute-0 nova_compute[186981]: 2025-11-22 10:20:24.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:20:24 compute-0 nova_compute[186981]: 2025-11-22 10:20:24.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:20:24 compute-0 podman[225018]: 2025-11-22 10:20:24.638155953 +0000 UTC m=+0.082089929 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 10:20:24 compute-0 podman[225019]: 2025-11-22 10:20:24.72163706 +0000 UTC m=+0.161225878 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 10:20:25 compute-0 nova_compute[186981]: 2025-11-22 10:20:25.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.628 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.629 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.630 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.630 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.843 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.844 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5733MB free_disk=73.45584106445312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.845 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.845 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.909 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.909 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.936 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.952 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.953 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:20:26 compute-0 nova_compute[186981]: 2025-11-22 10:20:26.954 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:20:27 compute-0 nova_compute[186981]: 2025-11-22 10:20:27.474 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:27 compute-0 nova_compute[186981]: 2025-11-22 10:20:27.954 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:20:27 compute-0 nova_compute[186981]: 2025-11-22 10:20:27.954 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:20:29 compute-0 nova_compute[186981]: 2025-11-22 10:20:29.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:20:30 compute-0 nova_compute[186981]: 2025-11-22 10:20:30.588 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:20:32 compute-0 nova_compute[186981]: 2025-11-22 10:20:32.476 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:20:32 compute-0 podman[225062]: 2025-11-22 10:20:32.588549524 +0000 UTC m=+0.046699935 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes 
Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 10:20:32 compute-0 podman[225063]: 2025-11-22 10:20:32.610598345 +0000 UTC m=+0.060090960 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal)
Nov 22 10:20:34 compute-0 podman[225100]: 2025-11-22 10:20:34.630421944 +0000 UTC m=+0.076276441 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 10:20:34 compute-0 podman[225101]: 2025-11-22 10:20:34.638553466 +0000 UTC m=+0.082572093 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 10:20:37 compute-0 nova_compute[186981]: 2025-11-22 10:20:37.479 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:20:37 compute-0 nova_compute[186981]: 2025-11-22 10:20:37.481 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:20:37 compute-0 nova_compute[186981]: 2025-11-22 10:20:37.481 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 22 10:20:37 compute-0 nova_compute[186981]: 2025-11-22 10:20:37.482 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:20:37 compute-0 nova_compute[186981]: 2025-11-22 10:20:37.488 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:37 compute-0 nova_compute[186981]: 2025-11-22 10:20:37.489 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:20:42 compute-0 nova_compute[186981]: 2025-11-22 10:20:42.490 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:43 compute-0 podman[225144]: 2025-11-22 10:20:43.58565095 +0000 UTC m=+0.045060609 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 10:20:47 compute-0 nova_compute[186981]: 2025-11-22 10:20:47.491 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:20:52 compute-0 nova_compute[186981]: 2025-11-22 10:20:52.493 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:20:52 compute-0 nova_compute[186981]: 2025-11-22 10:20:52.496 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:52 compute-0 nova_compute[186981]: 2025-11-22 10:20:52.497 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 22 10:20:52 compute-0 nova_compute[186981]: 2025-11-22 10:20:52.497 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:20:52 compute-0 nova_compute[186981]: 2025-11-22 10:20:52.498 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:20:52 compute-0 nova_compute[186981]: 2025-11-22 10:20:52.500 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:55 compute-0 podman[225169]: 2025-11-22 10:20:55.597384021 +0000 UTC m=+0.050787435 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 10:20:55 compute-0 podman[225170]: 2025-11-22 10:20:55.664533882 +0000 UTC m=+0.109845476 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Nov 22 10:20:57 compute-0 nova_compute[186981]: 2025-11-22 10:20:57.497 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:20:57 compute-0 nova_compute[186981]: 2025-11-22 10:20:57.500 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:02 compute-0 nova_compute[186981]: 2025-11-22 10:21:02.499 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:02 compute-0 nova_compute[186981]: 2025-11-22 10:21:02.502 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:03 compute-0 podman[225217]: 2025-11-22 10:21:03.599499602 +0000 UTC m=+0.052693178 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 10:21:03 compute-0 podman[225218]: 2025-11-22 10:21:03.617023719 +0000 UTC m=+0.067011028 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9)
Nov 22 10:21:05 compute-0 podman[225256]: 2025-11-22 10:21:05.629527018 +0000 UTC m=+0.076613120 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:21:05 compute-0 podman[225257]: 2025-11-22 10:21:05.658140309 +0000 UTC m=+0.105832917 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 10:21:07 compute-0 nova_compute[186981]: 2025-11-22 10:21:07.503 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:21:07 compute-0 nova_compute[186981]: 2025-11-22 10:21:07.505 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:21:07 compute-0 nova_compute[186981]: 2025-11-22 10:21:07.505 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 22 10:21:07 compute-0 nova_compute[186981]: 2025-11-22 10:21:07.505 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:21:07 compute-0 nova_compute[186981]: 2025-11-22 10:21:07.552 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:07 compute-0 nova_compute[186981]: 2025-11-22 10:21:07.552 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:21:12 compute-0 nova_compute[186981]: 2025-11-22 10:21:12.553 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:21:12 compute-0 nova_compute[186981]: 2025-11-22 10:21:12.554 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:12 compute-0 nova_compute[186981]: 2025-11-22 10:21:12.554 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 22 10:21:12 compute-0 nova_compute[186981]: 2025-11-22 10:21:12.554 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:21:12 compute-0 nova_compute[186981]: 2025-11-22 10:21:12.555 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:21:12 compute-0 nova_compute[186981]: 2025-11-22 10:21:12.555 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:14 compute-0 podman[225299]: 2025-11-22 10:21:14.593683078 +0000 UTC m=+0.044840354 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 10:21:17 compute-0 nova_compute[186981]: 2025-11-22 10:21:17.556 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:17 compute-0 nova_compute[186981]: 2025-11-22 10:21:17.558 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:21:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:21:17.952 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:21:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:21:17.952 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:21:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:21:17.953 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:21:22 compute-0 nova_compute[186981]: 2025-11-22 10:21:22.557 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:22 compute-0 nova_compute[186981]: 2025-11-22 10:21:22.558 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:21:23 compute-0 nova_compute[186981]: 2025-11-22 10:21:23.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:21:23 compute-0 nova_compute[186981]: 2025-11-22 10:21:23.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:21:23 compute-0 nova_compute[186981]: 2025-11-22 10:21:23.595 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:21:23 compute-0 nova_compute[186981]: 2025-11-22 10:21:23.617 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 10:21:24 compute-0 nova_compute[186981]: 2025-11-22 10:21:24.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:21:25 compute-0 nova_compute[186981]: 2025-11-22 10:21:25.595 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:21:26 compute-0 nova_compute[186981]: 2025-11-22 10:21:26.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:21:26 compute-0 nova_compute[186981]: 2025-11-22 10:21:26.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:21:26 compute-0 podman[225323]: 2025-11-22 10:21:26.638777188 +0000 UTC m=+0.082717947 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 22 10:21:26 compute-0 podman[225324]: 2025-11-22 10:21:26.67041002 +0000 UTC m=+0.110926345 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.560 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.618 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.619 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.619 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.619 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.801 186985 WARNING nova.virt.libvirt.driver [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.802 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5734MB free_disk=73.45486450195312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.802 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.803 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.855 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.855 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.887 186985 DEBUG nova.compute.provider_tree [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed in ProviderTree for provider: dd02da68-d6c7-4f1a-8710-21abb7ad1703 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.900 186985 DEBUG nova.scheduler.client.report [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Inventory has not changed for provider dd02da68-d6c7-4f1a-8710-21abb7ad1703 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.902 186985 DEBUG nova.compute.resource_tracker [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 10:21:27 compute-0 nova_compute[186981]: 2025-11-22 10:21:27.902 186985 DEBUG oslo_concurrency.lockutils [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:21:29 compute-0 nova_compute[186981]: 2025-11-22 10:21:29.903 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:21:29 compute-0 nova_compute[186981]: 2025-11-22 10:21:29.903 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 10:21:30 compute-0 nova_compute[186981]: 2025-11-22 10:21:30.589 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:21:31 compute-0 nova_compute[186981]: 2025-11-22 10:21:31.594 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:21:32 compute-0 nova_compute[186981]: 2025-11-22 10:21:32.562 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:21:32 compute-0 nova_compute[186981]: 2025-11-22 10:21:32.562 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:32 compute-0 nova_compute[186981]: 2025-11-22 10:21:32.562 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 22 10:21:32 compute-0 nova_compute[186981]: 2025-11-22 10:21:32.563 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:21:32 compute-0 nova_compute[186981]: 2025-11-22 10:21:32.563 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:21:32 compute-0 nova_compute[186981]: 2025-11-22 10:21:32.564 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:34 compute-0 podman[225372]: 2025-11-22 10:21:34.612731711 +0000 UTC m=+0.058190707 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 22 10:21:34 compute-0 podman[225373]: 2025-11-22 10:21:34.627895775 +0000 UTC m=+0.063750479 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 10:21:36 compute-0 nova_compute[186981]: 2025-11-22 10:21:36.589 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:21:36 compute-0 podman[225410]: 2025-11-22 10:21:36.59552167 +0000 UTC m=+0.053477939 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 10:21:36 compute-0 podman[225411]: 2025-11-22 10:21:36.607678101 +0000 UTC m=+0.062748851 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 10:21:37 compute-0 nova_compute[186981]: 2025-11-22 10:21:37.565 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:21:42 compute-0 nova_compute[186981]: 2025-11-22 10:21:42.567 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:45 compute-0 podman[225451]: 2025-11-22 10:21:45.627642313 +0000 UTC m=+0.070458442 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 10:21:47 compute-0 nova_compute[186981]: 2025-11-22 10:21:47.569 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:21:47 compute-0 nova_compute[186981]: 2025-11-22 10:21:47.571 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:21:47 compute-0 nova_compute[186981]: 2025-11-22 10:21:47.572 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 22 10:21:47 compute-0 nova_compute[186981]: 2025-11-22 10:21:47.572 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:21:47 compute-0 nova_compute[186981]: 2025-11-22 10:21:47.572 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:21:47 compute-0 nova_compute[186981]: 2025-11-22 10:21:47.573 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:52 compute-0 nova_compute[186981]: 2025-11-22 10:21:52.574 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:21:52 compute-0 nova_compute[186981]: 2025-11-22 10:21:52.576 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:57 compute-0 nova_compute[186981]: 2025-11-22 10:21:57.577 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:57 compute-0 nova_compute[186981]: 2025-11-22 10:21:57.578 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:21:57 compute-0 podman[225476]: 2025-11-22 10:21:57.639330802 +0000 UTC m=+0.092490573 container health_status 378f01a2e9cf452a62fec0653e1c5b16f17825f89dedf5b7a0408ec0c6eaa325 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 10:21:57 compute-0 podman[225477]: 2025-11-22 10:21:57.684917474 +0000 UTC m=+0.123394235 container health_status e70cf8dca78ca0671f83d13c0b698622308c874299638feff9da2e9d731b1c6e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 10:22:02 compute-0 nova_compute[186981]: 2025-11-22 10:22:02.580 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:22:02 compute-0 nova_compute[186981]: 2025-11-22 10:22:02.581 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:22:02 compute-0 nova_compute[186981]: 2025-11-22 10:22:02.582 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 22 10:22:02 compute-0 nova_compute[186981]: 2025-11-22 10:22:02.582 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:22:02 compute-0 nova_compute[186981]: 2025-11-22 10:22:02.626 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:22:02 compute-0 nova_compute[186981]: 2025-11-22 10:22:02.627 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:22:05 compute-0 podman[225523]: 2025-11-22 10:22:05.600339232 +0000 UTC m=+0.053685394 container health_status ff15f44cf5a5d558d855f2ced0c563de027b1794db008a0bac340640dd7d7296 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, version=9.6, architecture=x86_64)
Nov 22 10:22:05 compute-0 podman[225522]: 2025-11-22 10:22:05.619403232 +0000 UTC m=+0.076085145 container health_status 6c62c361770cd3e0442e716fd284905ea1b8591af0acaa5e69ab0b5b4d5ef40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 22 10:22:07 compute-0 podman[225562]: 2025-11-22 10:22:07.608554405 +0000 UTC m=+0.059935946 container health_status 6864acaf6533f67d174653b726099b161aa5fdcbc29c0355dc0976e315db49fc (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 10:22:07 compute-0 podman[225563]: 2025-11-22 10:22:07.625505397 +0000 UTC m=+0.072311723 container health_status a707d0772ea2b7b2d1a87667b177f5f6a45b6a94579cef1bdbda0e96e08ca323 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 10:22:07 compute-0 nova_compute[186981]: 2025-11-22 10:22:07.627 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:22:11 compute-0 sshd-session[225604]: Accepted publickey for zuul from 192.168.122.10 port 57216 ssh2: ECDSA SHA256:18GbJLZV+buKX8xH+pakpcEWZrvO1QAxoPz6QGSQl+4
Nov 22 10:22:12 compute-0 systemd-logind[819]: New session 29 of user zuul.
Nov 22 10:22:12 compute-0 systemd[1]: Started Session 29 of User zuul.
Nov 22 10:22:12 compute-0 sshd-session[225604]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 10:22:12 compute-0 sudo[225608]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 22 10:22:12 compute-0 sudo[225608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 10:22:12 compute-0 nova_compute[186981]: 2025-11-22 10:22:12.630 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:22:16 compute-0 podman[225769]: 2025-11-22 10:22:16.603377452 +0000 UTC m=+0.057566801 container health_status 2513067a521a60ea91f06d53b858710f17e68bf3acc200346d151b52662bbe9b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 10:22:16 compute-0 ovs-vsctl[225801]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:16 compute-0 ceilometer_agent_compute[197674]: 2025-11-22 10:22:16.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 10:22:17 compute-0 virtqemud[186556]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 22 10:22:17 compute-0 virtqemud[186556]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 22 10:22:17 compute-0 nova_compute[186981]: 2025-11-22 10:22:17.698 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:22:17 compute-0 nova_compute[186981]: 2025-11-22 10:22:17.699 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:22:17 compute-0 nova_compute[186981]: 2025-11-22 10:22:17.699 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5068 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 22 10:22:17 compute-0 nova_compute[186981]: 2025-11-22 10:22:17.699 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:22:17 compute-0 nova_compute[186981]: 2025-11-22 10:22:17.699 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:22:17 compute-0 nova_compute[186981]: 2025-11-22 10:22:17.700 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:22:17 compute-0 virtqemud[186556]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 10:22:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:22:17.952 104216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 10:22:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:22:17.953 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 10:22:17 compute-0 ovn_metadata_agent[104211]: 2025-11-22 10:22:17.953 104216 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 10:22:18 compute-0 crontab[226220]: (root) LIST (root)
Nov 22 10:22:20 compute-0 systemd[1]: Starting Hostname Service...
Nov 22 10:22:20 compute-0 systemd[1]: Started Hostname Service.
Nov 22 10:22:22 compute-0 nova_compute[186981]: 2025-11-22 10:22:22.702 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:22:22 compute-0 nova_compute[186981]: 2025-11-22 10:22:22.706 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 10:22:22 compute-0 nova_compute[186981]: 2025-11-22 10:22:22.707 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 22 10:22:22 compute-0 nova_compute[186981]: 2025-11-22 10:22:22.707 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:22:22 compute-0 nova_compute[186981]: 2025-11-22 10:22:22.768 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 10:22:22 compute-0 nova_compute[186981]: 2025-11-22 10:22:22.769 186985 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 10:22:23 compute-0 nova_compute[186981]: 2025-11-22 10:22:23.593 186985 DEBUG oslo_service.periodic_task [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 10:22:23 compute-0 nova_compute[186981]: 2025-11-22 10:22:23.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 10:22:23 compute-0 nova_compute[186981]: 2025-11-22 10:22:23.594 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 10:22:23 compute-0 nova_compute[186981]: 2025-11-22 10:22:23.624 186985 DEBUG nova.compute.manager [None req-4d6dd3bd-32f3-4281-8a39-93cc2978ffb6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
